Inquiry into speeding up code
-
Hello,
Very new to backtrader. Thanks, it is wonderful, and I was already running a serious backtest in less than 3 days!
Quick question: I am running a strategy that has a lot of filters on a daily time frame and then executes on one-minute data. It seems to be very slow, because the minute data has to be added first and then _next steps through it minute by minute.
I was wondering if any of the more experienced devs on this forum would have an idea of how to resolve this problem. I have been reading the forums and haven't found anything; if there is no way, I can build it outside of backtrader.
Thanks!
-
@bnewell said in Inquiry into speeding up code:
I am running a strategy that has a lot of filters on a daily time frame, and then runs a strategy on one minute data
Can you clarify this statement?
-
@run-out Yes, after reading it back I tried to edit it and realized it was very generic; excuse me.
So here is a quick example that gets at the heart of my problem. I am essentially running a similar strategy on minute crypto data covering the past ten years. The Hull filter and the volatility MA filter run on the aggregated daily data; this removes ~80% of the data from needing to be tested.
I was wondering if there is a way to move through the aggregated data first and then work down to the minute level, as opposed to the way the strategy below currently runs, row by row on the minute data.
I could do this outside of backtrader by chunking the data into groups where the conditions are met, running each chunk through backtrader, and reaggregating (a rough sketch of this follows the strategy code below), but I thought I'd check first whether there is a way to do this inside backtrader.
Thanks!!
import datetime as dt
import os

import backtrader as bt

# df (data loading) and per (path constants) are my own helper modules


class CryptoBreakout(bt.Strategy):
    params = (
        ("volatility_period", 20),
        ("volatility_ma_period", 10),
        ("breakout_period", 50),
        ("sar_period", 20),
        ("hull_day_ma", 10),
    )

    def log(self, txt):
        # minimal logger used by the order/trade notifications
        print(f"{self.datas[0].datetime.datetime(0)}, {txt}")

    def __init__(self):
        # datas[0] = minute bars, datas[1] = daily resample
        self.data_close = self.datas[0].close
        self.breakout_level = bt.ind.Highest(
            self.datas[0].high, period=self.params.breakout_period
        )
        self.data_day_close = self.datas[1].close
        self.order, self.buyprice, self.buycomm = None, None, None
        self.day_vol = bt.indicators.StandardDeviation(
            self.data_day_close, period=self.params.volatility_period
        )
        self.day_vol_ma = bt.indicators.MovingAverageSimple(
            self.day_vol, period=self.params.volatility_ma_period
        )
        self.day_ma = bt.indicators.HullMovingAverage(
            self.data_day_close, period=self.params.hull_day_ma
        )
        self.sar = bt.indicators.ParabolicSAR(self.data, period=self.params.sar_period)
        self.breakout_level.csv, self.day_vol.csv, self.day_vol_ma.csv, self.sar.csv = (
            True, True, True, True,
        )

    def notify_order(self, order):
        if order.status in [order.Submitted, order.Accepted]:
            return
        if order.status in [order.Completed]:
            if order.isbuy():
                self.log(
                    f"BUY EXECUTED, Price: {order.executed.price}, "
                    f"Cost: {order.executed.value}, Comm {order.executed.comm}"
                )
                self.buyprice, self.buycomm = order.executed.price, order.executed.comm
            self.bar_executed = len(self)
        elif order.status in [order.Canceled, order.Margin, order.Rejected]:
            status_dict = {
                order.Canceled: "Canceled",
                order.Margin: "Margin",
                order.Rejected: "Rejected",
            }
            self.log(f"Order Failure - {status_dict[order.status]}")
        self.order = None

    def next(self):
        outstanding_order, outstanding_position = self.order, self.position
        if outstanding_order:
            return
        if not outstanding_position:
            vol_expanding = self.day_vol > self.day_vol_ma
            day_price_trend_up = self.data_day_close > self.day_ma
            intraday_breakout = self.data_close[0] > self.breakout_level[-1]
            if intraday_breakout and day_price_trend_up and vol_expanding:
                self.order = self.buy()
        else:
            if self.data_close[0] < self.sar[0]:
                self.log(f"SELL CREATE, {self.data_close[0]}")
                self.order = self.close()


def breakout_v1() -> None:
    symbol = "BTCUSDT"
    cerebro = bt.Cerebro()
    # data = df.pull_binance_to_backtrader_individual(df.BackTraderGranularity.Hour_1, symbol)
    data = df.pull_binance_to_backtrader_individual(df.BackTraderGranularity.Minute_1, symbol)
    cerebro.adddata(data)
    cerebro.resampledata(data, timeframe=bt.TimeFrame.Days, compression=1)
    cerebro.addstrategy(CryptoBreakout)
    cerebro.broker.setcash(cash=100000)
    cerebro.addsizer(bt.sizers.AllInSizer, percents=50)
    cerebro.addwriter(
        bt.WriterFile,
        csv=True,
        out=os.path.join(
            per.BACKTESTING_BACKTRADER,
            f"breakout_{symbol}_{dt.datetime.today().strftime('%Y-%m-%d %H:%M:%S')}.csv",
        ),
    )
    cerebro.broker.setcommission(commission=0)
    print("Starting Portfolio Value: %.2f" % cerebro.broker.getvalue())
    cerebro.run(maxcpus=1)
    print("Final Portfolio Value: %.2f" % cerebro.broker.getvalue())
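For reference, a rough pandas sketch of the chunking idea above, assuming the minute bars sit in a DataFrame with a DatetimeIndex and a close column. The function name and the hard-coded filter periods are illustrative, and only the daily volatility filter is shown:

import pandas as pd

def passing_day_chunks(minute_df: pd.DataFrame) -> list[pd.DataFrame]:
    """Split minute bars into contiguous chunks of days that pass the daily filter."""
    daily_close = minute_df["close"].resample("1D").last()
    vol = daily_close.rolling(20).std()
    vol_ma = vol.rolling(10).mean()
    good_days = vol > vol_ma  # daily volatility-expansion condition
    # stamp each minute bar with its day's pass/fail flag
    ok = good_days.reindex(minute_df.index, method="ffill").fillna(False).astype(bool)
    # label contiguous runs of passing/failing days, keep only the passing ones
    run_id = (ok != ok.shift()).cumsum()
    return [chunk for _, chunk in minute_df[ok].groupby(run_id[ok])]

Each chunk could then be fed through backtrader separately and the results reaggregated, at the cost of losing any position that would span a gap between chunks.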
-
@bnewell said in Inquiry into speeding up code:
I was wondering if there was a way to move through the aggregated first, then work down to the minute, as opposed to the way the current strategy below is being run, row by row on the minute data.
bt is an event-driven Python backtester. You can't expect it to run at the same speed as array-based, C-based backtesters or tools. Probably the best way would be to do as many calculations as possible outside of bt, add them as extended data feeds (Pandas DataFrames, for example) and run the actual backtest using bt.
Take a look at this chapter as well; maybe pypy can help:
https://backtrader.com/blog/2019-10-25-on-backtesting-performance-and-out-of-memory/on-backtesting-performance-and-out-of-memory/
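To make the extended-data-feed suggestion concrete: backtrader's PandasData can be subclassed to carry extra DataFrame columns as additional lines. A minimal sketch, assuming the minute DataFrame already has a precomputed day_ok column holding the daily filter result (the class and column names are illustrative):

import backtrader as bt

class MinuteWithDaySignal(bt.feeds.PandasData):
    lines = ("day_ok",)          # one extra line beyond the standard OHLCV set
    params = (("day_ok", -1),)   # -1 = autodetect the DataFrame column by name

Usage sketch; the precomputed flag is then read in the strategy like any other line:

data = MinuteWithDaySignal(dataname=minute_df)
cerebro.adddata(data)

# in the strategy:
def next(self):
    if not self.datas[0].day_ok[0]:
        return  # daily filters failed; skip this minute bar before any other work

This keeps the per-minute work down to a single comparison on the ~80% of bars the daily filters exclude.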
-
@ab_trader said in Inquiry into speeding up code:
add them as extended data feeds (Pandas DataFrames, for example)
Ok, that is what I was assuming I would have to do. Thanks!