It seems you would need to define your own data feed. The docs have a chapter for binary feed development.
The docs also contain a BlazeData feed, which is nothing but a connection to an iterator (a cursor in database jargon). If your database provides an iterator, you should be all set.
@backtrader Hey, just came across this while searching for info. What do you mean by 'no trade.ref'? It's in the trade object. Thanks
ref:3 <-- here
data:<backtrader.feeds.csvgeneric.GenericCSVData object at 0x1032cf438>
@KT said in Broker with different multiplier for different dataset:
I would like to check how to set up the broker with a unique multiplier for each dataset.
Use a different customized CommissionInfo instance (with the desired multiplier and settings) for each asset.
@mpskowron Maybe this helps: https://www.backtrader.com/docu/mixing-timeframes/indicators-mixing-timeframes.html
I'm not sure whether it also applies to datas of different lengths, but it is designed for timeframes of different lengths.
First you will need to load the data feeds for all stocks into bt. You may want to parse your dictionary, compile the list of all stocks, and then load the data feeds into bt.
Then, every time you run your calculations in next() to issue a buy order, you need to check whether the stock is included in your original stock_list_# on that date.
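A sketch of that membership gate in plain Python. The dict name, dates and tickers are invented; in a real strategy the date would come from `d.datetime.date(0)` of each data feed:

```python
import datetime

# Hypothetical per-date universes: date -> set of tickers tradeable that day
stock_lists = {
    datetime.date(2020, 1, 2): {'AAPL', 'MSFT'},
    datetime.date(2020, 1, 3): {'MSFT', 'GOOG'},
}

def in_universe(ticker, date):
    """True if the ticker belongs to the stock list for this date."""
    return ticker in stock_lists.get(date, set())

# Inside next() one would gate the buy along these lines:
# for d in self.datas:
#     if in_universe(d._name, d.datetime.date(0)):
#         self.buy(data=d)
```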
Thank you for the post! Currently I am looking at testing a couple of ideas for American futures. The way I intend to go forward is to use back-adjusted prices and continuous contracts. I've already checked Quandl, and their continuous contracts are not suitable for me. I was thinking of chaining the data series based on max volume to match reality, so that the continuous contract only has prices from the months with max volume, but Quandl just chains contracts based on expiration months.
backtrader builds continuous contracts based on certain conditions (date and volume), but does not shift prices. As a result you will have gaps between contracts. So to apply this shift one needs to rewrite bt's rollover method, or just write a strategy which performs the shift and saves the prices into CSV files. I am not a super-Python guy, so I will probably go the second way.
I don't think that adjusted prices in the past will harm the tests. I am going to use dollar-value stats, not % stats. Also, the price delta of each trade will be the same as with real prices.
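A minimal sketch of that price shift with pandas. The series, dates and the roll point are made up for illustration; the idea is simply to lift the expiring contract by the gap observed on the roll date before splicing:

```python
import pandas as pd

# Hypothetical closing prices of two contracts around a volume-based roll
old = pd.Series([100.0, 101.0, 102.0],
                index=pd.to_datetime(['2020-03-09', '2020-03-10', '2020-03-11']))
new = pd.Series([105.0, 106.0, 107.0],
                index=pd.to_datetime(['2020-03-11', '2020-03-12', '2020-03-13']))

roll_date = pd.Timestamp('2020-03-11')

# Gap between the two contracts on the roll date
gap = new.loc[roll_date] - old.loc[roll_date]  # 105.0 - 102.0 = 3.0

# Back-adjust: lift the expiring contract by the gap, then splice at the roll
adjusted = pd.concat([old.loc[:roll_date].iloc[:-1] + gap,  # 103.0, 104.0
                      new.loc[roll_date:]])                 # 105.0 .. 107.0
```

Within each contract the bar-to-bar price deltas are unchanged, which is why dollar-value stats survive this adjustment.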
@backtrader said in Settings lines and params on __init__:
Not automatically. The reason to push daily timeframes to the limit of the day is to avoid them being overtaken by lower resolution timeframes. Because if the time of the daily resolution is 00:00:00 and is put in the same scheme as a data feed with minute resolution, a time 00:00:01 in the minute resolution data feed will surpass the daily resolution, which cannot happen in real life.
Correction: use the sessionend=datetime.time(hh, mm, ss, us) parameter when instantiating a data feed and that will be the end of session.
There must be something missing in the question, because the datas array can always be accessed inside a Strategy. It's the core of the idea.
getdata from the store only gives you the feed; you add it to the system with adddata (or reworked through resampledata, for example) and it will be there.
The Oanda data feeds behave just like any other (and can be mixed with other data feeds too)
Hi Backtrader, I've had all sorts of issues with IB in my manual trading. I was going to use them to run a strategy that makes 1 or 2 trades a day on an ETF using Backtrader. Before I invest time into this, would you recommend I use another broker instead? Thanks
I managed to overcome this with the chrome extension "High Contrast":
Re: Getting executed twice on closing orders
Ah, I am an idiot. Thanks for your help; I had a data feed issue. Thx!
Working code below.
import backtrader as bt
import bck  # the other module that builds the DataFrame

if __name__ == '__main__':
    # Start the brain
    cerebro = bt.Cerebro()

    # Load the strategy ('MyStrategy' stands in for the poster's class)
    cerebro.addstrategy(MyStrategy)

    # Starting cash level (amount illustrative)
    cerebro.broker.setcash(100000.0)

    # Set multiplier and commission level
    cerebro.broker.setcommission(commission=0.79, margin=3000.0, mult=1000.0)

    # Add the DataFrame from the other module
    df = bck.df()
    data = bt.feeds.PandasData(dataname=df, timeframe=bt.TimeFrame.Minutes, compression=1)

    # Load the data into the brain
    cerebro.adddata(data)

    print('Starting Portfolio Value: %.2f' % cerebro.broker.getvalue())
    results = cerebro.run()
    print('Final Portfolio Value: %.2f' % cerebro.broker.getvalue())
Solved. Thank you so much for the really quick reply!
Here is the code for reference for anyone else.
self.log('SELL EXECUTED, Price: %.2f, Cost: %.2f, Comm %.2f, Datetime: %s' %
         (order.executed.price, order.executed.value, order.executed.comm,
          bt.num2date(order.executed.dt)))
Because the data feed IBData registers itself in the store overwriting DataCls. This would allow anyone to write a different version of IBData (call it MyIBData), which would again overwrite the value of DataCls in the store.
And consequently, getdata would return an instance of MyIBData.
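The mechanism can be illustrated with a plain-Python sketch. This is not backtrader's actual code; the class names are reused only to mirror the explanation:

```python
class Store:
    DataCls = None  # overwritten by each data feed class that registers

    @classmethod
    def getdata(cls, *args, **kwargs):
        # Builds whatever feed class registered last
        return cls.DataCls(*args, **kwargs)

class RegisterData(type):
    """Metaclass: every class it creates overwrites Store.DataCls."""
    def __new__(meta, name, bases, dct):
        cls = super().__new__(meta, name, bases, dct)
        Store.DataCls = cls
        return cls

class IBData(metaclass=RegisterData):
    pass

class MyIBData(IBData):
    pass  # merely defining this re-registers: getdata now builds MyIBData
```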