
.resampledata() skips data

  • @backtrader it seems we have discussed this before, and I am not sure whether this is an issue with the code or an issue with my memory...

    I've mentioned I occasionally see the system skipping or not showing the most recent data when using backfill_from=.

    With the following example code, I see the following scenarios:

    1. Using .resampledata() to "register" the feed, with backfill_from= using the local static data, the output I see from the bar print in next() shows every other bar.

    2. Using .resampledata() without backfill_from= never gets to a LIVE data status with the IB connection.

    3. Using .adddata() shows the expected output.

    The code may be a bit more complicated than needed, but I have been using it to track down why I am not seeing valid indicator output when running live vs. in backtest (which may be related).

    #!/usr/bin/env python
    # -*- coding: utf-8; py-indent-offset:4 -*-
    from __future__ import (absolute_import, division, print_function,
                            unicode_literals)

    import datetime as dt

    import pytz

    # Import the backtrader platform
    import backtrader as bt
    import backtrader.feeds as btfeed
    import backtrader.indicators as btind

    EST = pytz.timezone('US/Eastern')


    class TestDVStrategy(bt.Strategy):
        def log(self, txt, dt=None):
            '''Logging function for this strategy'''
            dt = dt or self.data_spy.datetime.datetime(tz=EST)
            lstr = 'TEST: {}: {}'.format(dt.isoformat(), txt)
            print(lstr)

        def __init__(self):
            # To keep track of pending orders and buy price/commission
            self.trend = None
            self.datastatus = False
            self.data_spy = self.getdatabyname('SPY')
            # Add a MovingAverageSimple indicator based on SPY
            self.sma = btind.SMA(self.data_spy.close, period=200)

        def notify_data(self, data, status, *args, **kwargs):
            print('*' * 3, '%s DATA NOTIF: %s' %
                  (dt.datetime.now().strftime('%Y-%m-%dT%H:%M:%S'),
                   data._getstatusname(status)), *args)
            if status == data.LIVE:
                self.datastatus = True

        def notify_store(self, msg, *args, **kwargs):
            print('*' * 3, '%s STORE NOTIF: %s' %
                  (dt.datetime.now().strftime('%Y-%m-%dT%H:%M:%S'), msg))

        def next(self):
            if len(self) < 4:
                return  # need 4 SMA values before checking the trend

            if self.sma[-1] > self.sma[-2] > self.sma[-3] > self.sma[-4]:
                self.trend = 'up'
            else:
                self.trend = 'down'

            print('-- %04d' % len(self), str(self.datetime.datetime()))
            for i, d in enumerate(di for di in self.datas if len(di)):
                out = ['Data%d' % i, d._name, '%04d' % len(d),
                       str(d.datetime.datetime()), str(d.close[0])]
                print(', '.join(out))

            self.log('TEST: %.2f SMA: %.3f TREND: %s' %
                     (self.data_spy.close[0], self.sma[0], self.trend))


    def runstrategy():
        # Create a cerebro entity
        cerebro = bt.Cerebro()
        # Add a strategy
        cerebro.addstrategy(TestDVStrategy)

        # Parse static SPY data file
        # NOTE: the dtformat/column parameters were truncated in the post;
        # set them to match the CSV layout
        bfdata0 = btfeed.GenericCSVData(dataname='./datas/SPY-1D.csv',
                                        timeframe=bt.TimeFrame.Days,
                                        sessionstart=dt.time(9, 30),
                                        sessionend=dt.time(16, 0))

        # NOTE: the store kwargs were truncated in the post; these are
        # placeholder connection settings for TWS/Gateway
        storekwargs = dict(host='127.0.0.1', port=7496, clientId=35)
        ibstore = bt.stores.IBStore(**storekwargs)
        broker = ibstore.getbroker()
        cerebro.setbroker(broker)

        # SPY Live data timeframe resampled to 1 Day
        data0 = ibstore.getdata(dataname='SPY-STK-SMART-USD',
                                # backfill_from=bfdata0,
                                timeframe=bt.TimeFrame.Days, compression=1,
                                sessionstart=dt.time(9, 30),
                                sessionend=dt.time(16, 0))

        # cerebro.resampledata(data0, name='SPY',
        #                      timeframe=bt.TimeFrame.Days, compression=1)
        cerebro.adddata(data0, name='SPY')

        cerebro.run(tradehistory=True, exactbars=0)


    if __name__ == '__main__':
        runstrategy()

  • administrators

    Worth looking into it.

    1. Using .adddata() shows the expected output.

    This statement is probably not what it seems. Without resampledata, the feed returned by the IBStore will download the historical data in Days (because the timeframe tells it to do so), but it will afterwards deliver ticks.

    It would be much better to pass bt.TimeFrame.Ticks (which is what the feed actually delivers) and let resampledata do its thing, with the historical data being downloaded in bt.TimeFrame.Days.
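
    In other words, the live feed delivers ticks and the resampler is what collapses them into daily bars. A library-free sketch of that idea (this is not backtrader's actual resampler; the Tick/Bar names are illustrative):

```python
from collections import namedtuple
import datetime as dt

Tick = namedtuple('Tick', 'dt price')
Bar = namedtuple('Bar', 'date open high low close')

def resample_ticks_to_days(ticks):
    """Collapse a stream of ticks into one OHLC bar per calendar day."""
    bars = []
    cur = None  # the bar currently being built
    for t in ticks:
        day = t.dt.date()
        if cur is not None and day != cur.date:
            bars.append(cur)  # day boundary crossed: emit the finished bar
            cur = None
        if cur is None:
            cur = Bar(day, t.price, t.price, t.price, t.price)
        else:
            cur = cur._replace(high=max(cur.high, t.price),
                               low=min(cur.low, t.price),
                               close=t.price)
    if cur is not None:
        bars.append(cur)  # flush the last (possibly incomplete) bar
    return bars

ticks = [
    Tick(dt.datetime(2017, 5, 1, 9, 30), 238.0),
    Tick(dt.datetime(2017, 5, 1, 12, 0), 239.5),
    Tick(dt.datetime(2017, 5, 1, 15, 59), 238.7),
    Tick(dt.datetime(2017, 5, 2, 9, 31), 239.0),
    Tick(dt.datetime(2017, 5, 2, 15, 58), 240.2),
]
bars = resample_ticks_to_days(ticks)
for b in bars:
    print(b.date, b.open, b.high, b.low, b.close)
```

    Because the input is ticks, the same stream can later be collapsed again at any larger compression, which is why delivering ticks to the resampler is the flexible option.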

    This is an area which would probably benefit from some rework (you already mentioned it, and it is done this way with the VisualChart live feed, for other reasons):

    • Having getdata (actually IBData) directly return a resampled version of itself for the given timeframe

    Which would also require:

    • To separate the historical (for backfilling) download from the internals of IBData

    In order to be able to do a 2nd resampledata (and a 3rd if need be) with a different (larger or smaller) timeframe/compression combination.
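
    The payoff of that separation would be one tick stream feeding several resamplers at different compressions. A plain-Python toy to make the point (not the backtrader API; the function and names are illustrative):

```python
import datetime as dt

def resample(ticks, ndays):
    """Group (datetime, price) ticks into [open, high, low, close] bars
    spanning ndays calendar days each."""
    bars, key, cur = [], None, None
    start = ticks[0][0].date()  # anchor the buckets at the first tick's day
    for t, p in ticks:
        k = (t.date() - start).days // ndays  # bucket for this compression
        if key is not None and k != key:
            bars.append(cur)  # boundary crossed: emit the finished bar
        if k != key:
            key, cur = k, [p, p, p, p]
        else:
            cur[1] = max(cur[1], p)  # high
            cur[2] = min(cur[2], p)  # low
            cur[3] = p               # close
    if cur is not None:
        bars.append(cur)
    return bars

# Two ticks per day over four days
ticks = [(dt.datetime(2017, 5, d, h), 100.0 + d + h / 100.0)
         for d in (1, 2, 3, 4) for h in (10, 15)]

daily = resample(ticks, 1)    # compression=1 -> four daily bars
two_day = resample(ticks, 2)  # compression=2 -> two 2-day bars
print(len(daily), len(two_day))
```

    The same input serves both calls; only the compression differs, which is exactly what stacked resampledata calls over one separated historical/live feed would allow.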

    In hindsight, some extra thought should have gone into these points at the time of development.

  • I somehow think backfill_from= is part of the issue here. Also, it is not clear from your comments, but note that in this script I am only requesting one feed from IB.

    The issue is seen when I backfill static data and for some reason, when using the .resampledata() method, the system is dropping every other bar. This is identical to another issue we were seeing when I was trying to .resampledata() from the InfluxDB feed. In that case, .adddata() produced all bars. .resampledata() dropped every other bar.

  • administrators

    @RandyT said in .resampledata() skips data:

    The issue is seen when I backfill static data and for some reason, when using the .resampledata() method, the system is dropping every other bar. This is identical to another issue we were seeing when I was trying to .resampledata() from the InfluxDB feed. In that case, .adddata() produced all bars. .resampledata() dropped every other bar.

    This issue was due to not having specified the sessionend for the data, which collided with the algorithm in charge of knowing when the day has come to an end. That was also the reason for adding a default sessionend to, at least, all CSV data feeds, which in turn uncovered that some users were not even specifying the incoming timeframe, confusing the resampling code.
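
    The effect is easy to see with a toy boundary check (illustrative only; the actual logic lives in backtrader's resampler and session handling):

```python
import datetime as dt

def day_complete(bar_time, session_end):
    """A daily bar can be closed once its timestamp reaches the session end."""
    return bar_time.time() >= session_end

# With an explicit sessionend, the 16:00 stamp closes the daily bar at once
print(day_complete(dt.datetime(2017, 5, 1, 16, 0), dt.time(16, 0)))

# Without it, a feed falling back to end-of-day keeps the bar open: the
# 16:00 stamp looks mid-session, and the bar is only delivered once the
# next session's data arrives
print(day_complete(dt.datetime(2017, 5, 1, 16, 0), dt.time(23, 59, 59)))
```

    A bar that is only released when the next day's data shows up is how pairs of days can end up merged, producing the every-other-bar symptom described above.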
