I would be interested to know if you explored this, and whether you ended up using Nautilus or not.
-
Hi Team,
I really like the architecture and principles used in Nautilus. I admit I have not tried working with anything yet; I am still evaluating whether this project would work for my use case. I trade a large universe of stocks using minute bars, both in backtesting and live. When backtesting, I use the universe of all stocks traded on Nasdaq and NYSE: roughly 5k tickers on any given day, and 8k+ tickers over a multi-year backtest. I currently use AmiBroker, a vector-based platform that can process a few years of minute data for 8k symbols very quickly.
For live trading, I port my strategy over to Python (this is one thing I would love to solve: backtest vs. live parity). I do not need to scan all 5k stocks live, as that would be too much data to process. The list gets filtered down to about 100-200 tickers at any one time, each receiving minute-resolution real-time data (currently streamed from the polygon.io websocket).
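To make the filter-then-subscribe workflow above concrete, here is a rough Python sketch. The filtering criteria and the snapshot data are purely illustrative assumptions; none of this uses real Nautilus or polygon.io APIs.

```python
# Illustrative sketch only: reduce a ~5k-ticker universe to the small
# live-trading watchlist. Thresholds and data are hypothetical.

def filter_universe(snapshot, min_price=5.0, min_dollar_volume=1_000_000):
    """Return the sorted list of tickers passing simple liquidity filters.

    snapshot maps ticker -> (last price, share volume); the real filter
    would use whatever scan criteria the strategy needs.
    """
    return sorted(
        ticker
        for ticker, (price, volume) in snapshot.items()
        if price >= min_price and price * volume >= min_dollar_volume
    )

# Hypothetical end-of-day snapshot.
snapshot = {
    "AAPL": (190.0, 50_000_000),   # liquid, passes both filters
    "PENNY": (0.8, 2_000_000),     # fails the price floor
    "THIN": (25.0, 10_000),        # fails the dollar-volume floor
}

watchlist = filter_universe(snapshot)
print(watchlist)  # -> ['AAPL']
# The live engine would then subscribe only these tickers to the
# real-time minute-bar feed instead of the full universe.
```

The point is that the heavy, full-universe scan runs offline, and only the resulting short watchlist touches the real-time data path.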
I know there are a lot more details to consider, but I was wondering, from the community and people experienced with the platform, whether this seems feasible in Nautilus (with custom adapters, of course)? Is the platform built for large universes? Most of the samples use a single instrument, or at most two, per strategy, so I wanted to know whether I would be pushing the platform in unintended directions.
Thanks!
P.S. I would like to add that I watched all 3 parts of the webinar on YouTube and browsed the code, questions, and forum on GitHub. Not trying to make anyone do all the work for me!