Anyone using Backtrader for live trading
I created a pull request in mementum/backtrader:
Changes in ibstore.py to get CFD market data. #240
You may include it in the next release after review.
@skytrading54 It seems it's all about managing a CFD product as if it were a CASH product. Is that all?
@backtrader yes that's right.
I am facing a delay issue in processing updates from IB when using multiple products with different market timings (meaning for some of them the market is open and for some it's not).
E.g.: IBAU200, USO, IBUS100, E-mini future.
When running this during US market hours, the market is open for 3 of them but not for Australia. The historical data for the first 3 is downloaded very quickly, but the 4th one gets stuck somewhere, and so the next calls in the strategy happen very slowly: it takes up to 10 minutes for next to complete for historical data with a 1-minute timeframe/compression. The same applies if I run this during Australian market hours, since the US market is not open by then.
If I remove IBAU200 (and add many other US products) it works fine (all next calls for historical data are done within a few seconds). Wonder what the issue could be.
I tried passing session start / session end times while creating each of the data feeds, but it's still the same issue.
It seems to be something to do with backtrader waiting for the first real-time update of IBAU200 before it can subscribe to historical data for the same feed, which never happens, and so there is some sort of looping.
Any idea how this could be fixed? If resolved, this would help run one program across multiple markets.
This is an aspect which is being worked on (worked on as in: right now). There are several factors in play here.
backtrader grew up from being purely meant for backtesting into adding datas which can keep on pumping. Things grew on top of the existing code.
Most of the initial use cases had to deal with minutes (apparently a very large timeframe).
There is suddenly, so to say, a 2nd generation of use cases which all involve large timeframes, e.g. waiting for the end of the day to kick in before issuing an order.
The problem arises because the larger timeframe goes live a lot sooner than the smaller timeframe, and the larger timeframe tries to wait for more incoming data to see if it can deliver. This makes sense in the case of replay, because the larger timeframe can deliver even if not complete, and waiting seems (actually seemed) like an acceptable compromise.
But if the larger timeframe is being resampled, it no longer makes sense to wait, because it will only be complete when it is complete (session end, timeframe/compression boundary, ...) and not a minute before.
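For reference, the two modes discussed are selected when the data is loaded into cerebro. A minimal sketch (the CSV feed name and its parameters are illustrative, not from the thread):

```python
import backtrader as bt

cerebro = bt.Cerebro()

# Hypothetical 1-minute feed; any data feed class would do here
data = bt.feeds.GenericCSVData(
    dataname='minutes.csv',
    dtformat='%Y-%m-%d %H:%M:%S',
    timeframe=bt.TimeFrame.Minutes,
)

# Resampling: the daily bar is delivered only once it is complete
# (session end / timeframe boundary), so waiting for more data is pointless
cerebro.resampledata(data, timeframe=bt.TimeFrame.Days, compression=1)

# Replaying (the alternative): the developing daily bar is delivered
# repeatedly as it builds up, which is where waiting for incoming data pays off
# cerebro.replaydata(data, timeframe=bt.TimeFrame.Days, compression=1)
```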
Let's not forget to offer a solution to the problem at hand. Use the parameter qcheck for the larger timeframes. qcheck=0.05, for example (the default is 0.5, i.e. half a second), will reduce the lag during the historical shootout by a factor of 10.
Of course, the internal loop checking whether a resampled data can deliver will tick faster. This will be invisible to the end user, but it keeps the CPU active a lot more often. Unless your CPU is really limited, it should make a huge difference to start with.
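A sketch of how qcheck could be passed when setting up the feeds (the port and the exact dataname contract strings are assumptions, adjust for your account):

```python
import backtrader as bt

cerebro = bt.Cerebro()
store = bt.stores.IBStore(port=7496)  # assumed paper-trading TWS port

# Feed whose market is open: the default qcheck (0.5 seconds) is fine
data0 = store.getdata(dataname='IBUS100-CFD-SMART')
cerebro.resampledata(data0, timeframe=bt.TimeFrame.Minutes, compression=1)

# Feed whose market may be closed: check for deliverable bars 10x as often,
# reducing the time spent waiting on it during the historical shootout
data1 = store.getdata(dataname='IBAU200-CFD-SMART', qcheck=0.05)
cerebro.resampledata(data1, timeframe=bt.TimeFrame.Minutes, compression=1)
```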
And ... the development branch contains code to alleviate the situation. It will not only work during the initial backfilling stage, but during any stage. I.e.: if not all feeds have the LIVE status, there will be no waiting, because some other feeds can (or will, when the download is complete) produce data from historical sources.
Surprisingly, in any case, the new code works against the faux data which is delivered even during the weekends, but not against a regular account, which would get only historical data now ...
Thanks @backtrader, I will check with the qcheck parameter on a weekday when data is available in the paper account, and will take the development branch once it is done.
The development branch may still undergo a couple of iterations. The new code works, but has the side effect of spinning up the CPU because there is no waiting time. A couple of extra use cases will make it in.
The qcheck effect can be tested with the demo, which runs continuously even during the weekend.
newtrader last edited by
@backtrader I just recently started using backtrader (this is such a great learning experience) for backtesting. I wanted to use my strategy to place orders on a Bitcoin exchange. I have a pipe producing the data and the backtesting runs successfully using that same data.
But where exactly should I place the code to make the calls to the broker's API?
if self.dataclose[0] < self.dataclose[-1]:
    # current close less than previous close
    if self.dataclose[-1] < self.dataclose[-2]:
        # previous close less than the previous close
        # BUY, BUY, BUY!!! (with default parameters)
        self.log('BUY CREATE, %.2f' % self.dataclose[0])
        # Keep track of the created order to avoid a 2nd order
        self.order = self.buy()  # Should this be replaced by e.g. self.order = my_api.buy() ???
You have to tell the system which broker to connect to. The API in the strategy doesn't change (or else it wouldn't make sense).
See for example Docs - Interactive Brokers
The same concepts apply to the other two live brokers, but basically with the Store pattern (which is preferred):

    ...
    store = bt.stores.IBStore(**myparams)  # like port to connect to
    broker = store.getbroker(**brokerargs)  # if any is needed
    cerebro.broker = broker
    ...
You probably also want to use a data feed from the same service, e.g. with store.getdata(**datakwargs).
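Putting the above together, a minimal wiring sketch (host, port, clientId, the contract name and MyStrategy are all illustrative; a running TWS/Gateway instance is assumed):

```python
import backtrader as bt

class MyStrategy(bt.Strategy):
    # The unchanged backtesting strategy goes here; buy()/sell() calls are
    # routed to whichever broker cerebro is wired to
    def next(self):
        pass

cerebro = bt.Cerebro()
store = bt.stores.IBStore(host='127.0.0.1', port=7496, clientId=35)

# Broker and data both come from the same store, so orders and
# market data travel over the same connection
cerebro.broker = store.getbroker()
data = store.getdata(dataname='EUR.USD-CASH-IDEALPRO',
                     timeframe=bt.TimeFrame.Minutes)
cerebro.resampledata(data, timeframe=bt.TimeFrame.Minutes, compression=1)

cerebro.addstrategy(MyStrategy)
cerebro.run()
```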
newtrader last edited by
@backtrader Thank you for the quick reply! I did see this page, where it leverages IBBroker via a store. But as I am trying to connect to an exchange, say Bitfinex, is there documentation on how I can create a custom broker?
There isn't documentation for creating a broker because each broker has a different way of doing things. For example: Interactive Brokers has bracket orders (a group of 3 orders in which 2 of them bracket the 3rd in the middle) whereas Oanda has bracketing support (a single order with 3 possible prices, in which 2 of them bracket the price in the middle).
Some approaches were tried along the way and the best approach in any case would be:
- Not to create a specific Order object, even if subclassed from the standard Order
- Use a standard Order object and create broker-specific order dicts, lists and other things inside the broker code
- Use the Store pattern to avoid direct instantiation of specific Data classes. This also helps to keep common parameters (like for example the host for the connection) unified
- Use the code from Oanda as a basis.
In general you will need:
- Background threads to process data and broker events
- Synchronized queues to receive the events from the background threads
- A mapping from the data events (like not being able to access a data feed due to permissions -> NOTSUBSCRIBED) to the events defined in the backtrader classes
- A mapping also for some broker events. Some brokers report expired orders as cancelled. It may not really be that important at the end, but if it's important for you, you should be able to discriminate.
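The moving parts listed above can be sketched in plain Python. This is an illustrative skeleton of the pattern (class names, the status mapping and the event format are made up for the sketch, not backtrader's actual broker API):

```python
import queue
import threading

# Mapping exchange-side order statuses onto backtrader-style ones; note how
# 'expired' collapses into 'Canceled', as some brokers report it that way
EXCHANGE_TO_BT = {
    'filled': 'Completed',
    'cancelled': 'Canceled',
    'expired': 'Canceled',
    'rejected': 'Rejected',
}

class SketchBroker:
    def __init__(self):
        self.events = queue.Queue()   # synchronized hand-off from the thread
        self.statuses = []

    def _listen(self, raw_events):
        # Stand-in for the background thread reading the exchange connection
        for ev in raw_events:
            self.events.put(ev)
        self.events.put(None)         # sentinel: connection closed

    def run(self, raw_events):
        t = threading.Thread(target=self._listen, args=(raw_events,))
        t.start()
        while True:
            ev = self.events.get()    # blocks, like a broker notification loop
            if ev is None:
                break
            # Unknown statuses fall back to a neutral 'Submitted'
            self.statuses.append(EXCHANGE_TO_BT.get(ev['status'], 'Submitted'))
        t.join()

broker = SketchBroker()
broker.run([{'status': 'filled'}, {'status': 'expired'}])
print(broker.statuses)  # ['Completed', 'Canceled']
```

A real implementation would also translate data-feed events (e.g. a permissions failure into NOTSUBSCRIBED) through an analogous mapping.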