My (different) data feeds don't start/end at the same date.

  • Hi, I work with many different stock data series, and they don't start/end on the same dates. When I add all of the stock data, it only shows the overlapping dates. How do I deal with that?


  • administrators

    With no code, no output, no chart ... the question would be: what is the actual meaning of show in:

    When I add all of the stock data, it only shows the overlapping dates

  • Oh, I'm sorry.

    Here is my code. I just add 2 data feeds into cerebro (the companies entered the market on 2009-09-02 for the ADVANC symbol and on 2012-05-31 for the AAV symbol) and run an example strategy to print out the data (Close). I noticed that the backtest starts at 2012-05-31, so I can't access any data before 2012-05-31. The question is: how do I add all the data without cutting any rows off?


    import backtrader as bt
    import os
    import sys
    import datetime
    import pandas as pd

    # Create a Strategy
    class TestStrategy(bt.Strategy):
        def log(self, txt, dt=None):
            '''Logging function for this strategy'''
            dt = dt or self.datas[1].datetime.date(0)
            print('%s, %s' % (dt.isoformat(), txt))

        def __init__(self):
            # Keep a reference to the "close" line of the datas[1] dataseries
            self.dataclose = self.datas[1].close

        def next(self):
            # Simply log the closing price of the series from the reference
            self.log('Close, %.2f' % self.dataclose[0])

    if __name__ == '__main__':
        cerebro = bt.Cerebro()

        # Add a strategy
        cerebro.addstrategy(TestStrategy)

        # Universe of symbols to load; the feeds start on different dates
        ATSET100UNIV = [
            'AAV',     # It starts at 2012-05-31
            'ADVANC',  # It starts at 2009-09-02
        ]

        # Data files are CSVs in datapath, one file per symbol
        datapath = '/Users/tempurak/Work/ls-eq/data/'
        symbols = os.listdir(datapath)
        for file in symbols:
            # Create a Data Feed for each symbol in the universe
            if file[:-4] in ATSET100UNIV:
                data = bt.feeds.GenericCSVData(
                    dataname=datapath + file,
                    fromdate=datetime.datetime(2000, 1, 1),
                    todate=datetime.datetime(2017, 2, 20),
                    nullvalue=0.0,
                    dtformat='%Y-%m-%d',
                    datetime=0,
                    open=1,
                    high=2,
                    low=3,
                    close=4,
                    volume=5,
                    openinterest=-1)
                cerebro.adddata(data)

        # Set our desired cash start
        cerebro.broker.setcash(100000.0)
        cerebro.addanalyzer(bt.analyzers.PyFolio, _name='pyfolio')

        print('Starting Portfolio Value: %.2f' % cerebro.broker.getvalue())
        cerebro.run()
        print('Final Portfolio Value: %.2f' % cerebro.broker.getvalue())

    The Output

    Starting Portfolio Value: 100000.00
    2012-05-31, Close, 3.70
    2012-06-01, Close, 3.28
    2012-06-05, Close, 3.12
    2012-06-06, Close, 3.30

  • administrators

    This is by design. Each object has a minimum period, and while that minimum period is not met, the objects that depend on it cannot go into next and remain in prenext. For data feeds the minimum period is simply 1, i.e.: they have started delivering data.

    The reason being as simple as:

    def next(self):
        print(self.data0.close[0])  # ok ... starts earlier than self.data1 and has delivered
        print(self.data1.close[0])  # NOK ... kaboom ... would raise an exception if data1 hadn't started delivering yet, but inside next it has ...

    The idea is that code NEVER breaks inside next. There is no code guard around self.data1 checking whether it has already delivered, because it has: it is guaranteed in next that all objects are already delivering.
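
    For completeness, the same mechanism applies to indicators, whose minimum periods are larger than 1. A minimal sketch (the class name and the 30-bar period are illustrative choices, not from this exchange):

    import backtrader as bt

    class MinPeriodDemo(bt.Strategy):
        def __init__(self):
            # the SMA needs 30 bars before it can produce a value, so the
            # strategy inherits a minimum period of 30
            self.sma = bt.indicators.SimpleMovingAverage(self.data, period=30)

        def prenext(self):
            # called on each bar while the minimum period is not yet met
            print('prenext, bar %d' % len(self.data))

        def next(self):
            # only called from bar 30 onwards, once the SMA delivers
            print('next, bar %d, sma %.2f' % (len(self.data), self.sma[0]))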


    You can always forward each call to prenext to next, but you will then have to manually check the length of data1 (and of any associated indicators) to make sure the code doesn't break.

    def next(self):
        print(self.data0.close[0])  # ok ... starts earlier than self.data1 and has delivered
        if len(self.data1):
            print(self.data1.close[0])  # guard placed because it may not have delivered yet ...
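
    Putting the two pieces together, a minimal sketch of the complete pattern (the class name is invented for illustration, and it assumes, as in the snippets above, that data0 starts earlier than data1):

    import backtrader as bt

    class MultiStartStrategy(bt.Strategy):
        def prenext(self):
            # forward each prenext call to next so processing starts as soon
            # as the earliest feed (assumed to be data0) begins delivering
            self.next()

        def next(self):
            print(self.data0.datetime.date(0), self.data0.close[0])
            if len(self.data1):  # guard: data1 may not have delivered yet
                print(self.data1.datetime.date(0), self.data1.close[0])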
