    PSAR indicator and multiple timeframes

    General Code/Help
      borodiliz last edited by

      Hi all,

      I'm not sure if this is my fault for not understanding how indicators work. I'm trying to use the PSAR indicator for several timeframes (5 and 15 minutes). Using just one timeframe it works fine:

      cerebro.resampledata(
                           data,
                           timeframe=bt.TimeFrame.Minutes,
                           compression=15)
      

      In my strategy I use the PSAR indicator as follows:

      self.testPsar = bt.indicators.PSAR(self.datas[0])
      

      For just one timeframe it works as expected; my output is:

      2017-01-02 02:30:00, Datas[0].close 1.051830 Psar 1.052140
      2017-01-02 02:45:00, Datas[0].close 1.051700 Psar 1.051720
      2017-01-02 03:00:00, Datas[0].close 1.051540 Psar 1.052300
      2017-01-02 03:15:00, Datas[0].close 1.051520 Psar 1.052300
      2017-01-02 03:30:00, Datas[0].close 1.051110 Psar 1.052279
      2017-01-02 03:45:00, Datas[0].close 1.050800 Psar 1.051985
      2017-01-02 04:00:00, Datas[0].close 1.049010 Psar 1.051716

      The problem arises when I use several timeframes:

      cerebro.resampledata(
                           data,
                           timeframe=bt.TimeFrame.Minutes,
                           compression=5)
      
      cerebro.resampledata(
                           data,
                           timeframe=bt.TimeFrame.Minutes,
                           compression=15)
      

      and my strategy uses datas[1] instead of datas[0]:

      self.testPsar = bt.indicators.PSAR(self.datas[1])
      

      The output:

      2017-01-02 02:30:00, Datas[0] 1.051830 Datas[1].close 1.051830 Psar 1.052140
      2017-01-02 02:35:00, Datas[0] 1.052080 Datas[1].close 1.051830 Psar 1.052140
      2017-01-02 02:40:00, Datas[0] 1.051780 Datas[1].close 1.051830 Psar 1.052140
      2017-01-02 02:45:00, Datas[0] 1.051700 Datas[1].close 1.051700 Psar 1.051720
      2017-01-02 02:50:00, Datas[0] 1.051560 Datas[1].close 1.051700 Psar 1.052300
      2017-01-02 02:55:00, Datas[0] 1.051320 Datas[1].close 1.051700 Psar 1.051660
      2017-01-02 03:00:00, Datas[0] 1.051540 Datas[1].close 1.051540 Psar 1.052300
      2017-01-02 03:05:00, Datas[0] 1.051760 Datas[1].close 1.051540 Psar 1.052300
      2017-01-02 03:10:00, Datas[0] 1.051530 Datas[1].close 1.051540 Psar 1.052300
      2017-01-02 03:15:00, Datas[0] 1.051520 Datas[1].close 1.051520 Psar 1.052300
      2017-01-02 03:20:00, Datas[0] 1.050920 Datas[1].close 1.051520 Psar 1.052279
      2017-01-02 03:25:00, Datas[0] 1.051190 Datas[1].close 1.051520 Psar 1.052258
      2017-01-02 03:30:00, Datas[0] 1.051110 Datas[1].close 1.051110 Psar 1.052238
      2017-01-02 03:35:00, Datas[0] 1.051110 Datas[1].close 1.051110 Psar 1.051952
      2017-01-02 03:40:00, Datas[0] 1.051170 Datas[1].close 1.051110 Psar 1.051800
      2017-01-02 03:45:00, Datas[0] 1.050800 Datas[1].close 1.050800 Psar 1.051800
      2017-01-02 03:50:00, Datas[0] 1.050850 Datas[1].close 1.050800 Psar 1.051571
      2017-01-02 03:55:00, Datas[0] 1.050740 Datas[1].close 1.050800 Psar 1.050760
      2017-01-02 04:00:00, Datas[0] 1.049010 Datas[1].close 1.049010 Psar 1.051560
      2017-01-02 04:05:00, Datas[0] 1.047780 Datas[1].close 1.049010 Psar 1.051560
      2017-01-02 04:10:00, Datas[0] 1.048140 Datas[1].close 1.049010 Psar 1.051560
      2017-01-02 04:15:00, Datas[0] 1.048210 Datas[1].close 1.048210 Psar 1.051560

      Shouldn't the PSAR indicator hold the same value for every 15 minutes (i.e. every 3 intervals)?

      Any ideas? Thanks in advance!

        backtrader administrators last edited by

        Ideally yes. But a few unconnected pieces of code certainly cannot lead to any diagnosis.

          borodiliz last edited by borodiliz

          Thanks for your reply @backtrader

          Here's a complete example:

          from __future__ import absolute_import
          from __future__ import division
          from __future__ import print_function
          from __future__ import unicode_literals
          import argparse
          import backtrader as bt
          import backtrader.feeds as btfeeds
          import os.path
          import sys
          
          class PsarMultipleIntervalsStrategy(bt.Strategy):
              def log(self, txt, dt=None, tm=None):
                  ''' Logging function fot this strategy'''
                  dt = dt or self.datas[0].datetime.date(0)
                  tm = tm or self.datas[0].datetime.time(0)
                  print('%s %s, %s' % (dt.isoformat(), tm.isoformat(), txt))
          
              def __init__(self):
          
                  self.data1min = self.datas[0]
                  self.data5min = self.datas[1]
                  self.data15min = self.datas[2]
          
                  # new PSAR indicator for datas[1] (5 mins)
                  self.psar5min = bt.indicators.PSAR(self.data5min)
          
                  # new PSAR indicator for datas[2] (15 mins)
                  self.psar15min = bt.indicators.PSAR(self.data15min)
          
              def next(self):
                  self.log ("data1min.close %.6f data5min.close  %.6f data15min.close %.6f psar5min %.6f psar15min %.6f " % (self.data1min.close[0], self.data5min.close[0], self.data15min.close[0], self.psar5min.psar[0], self.psar15min.psar[0]))
          
          
          def parse_args():
              parser = argparse.ArgumentParser(
                                               formatter_class=argparse.ArgumentDefaultsHelpFormatter,
                                               description='PsarMultipleIntervalsStrategy')
          
              parser.add_argument('--datapath', '-dp',
                                  default='./DAT_ASCII_EURUSD_M1_201705.csv',
                                  help='Absolute path to data')
          
              return parser.parse_args()
          
          if __name__ == '__main__':
          
              args = parse_args()
          
              # Create a cerebro entity
              cerebro = bt.Cerebro()
          
              # Add a strategy
              cerebro.addstrategy(PsarMultipleIntervalsStrategy)
          
              # Datas are in a subfolder of the samples. Need to find where the script is
              # because it could have been called from anywhere
              modpath = os.path.dirname(os.path.abspath(sys.argv[0]))
              datapath = os.path.join(modpath, args.datapath)
          
              ## http://www.histdata.com/f-a-q/
              data = btfeeds.GenericCSVData(
                                            timeframe=bt.TimeFrame.Minutes,
                                            dataname=datapath,
                                            separator=';',
                                            nullvalue=float('NaN'),
                                            dtformat=('%Y%m%d %H%M%S'),
                                            tmformat=('%H%M%S'),
                                            datetime=0,
                                            time=-1,
                                            open=1,
                                            high=2,
                                            low=3,
                                            close=4,
                                            volume=5,
                                            openinterest=-1
                                            )
          
              #Replayer, for 1 min data
              data.replay(
                          timeframe=bt.TimeFrame.Minutes,
                          compression=1)
          
              cerebro.adddata(data)
          
              #Resampler for 5 minutes
              cerebro.resampledata(
                                   data,
                                   timeframe=bt.TimeFrame.Minutes,
                                   compression=5)
          
              #Resampler for 15 minutes
              cerebro.resampledata(
                                   data,
                                   timeframe=bt.TimeFrame.Minutes,
                                   compression=15)
          
              # Run over everything
              cerebro.run(preload=False)
          
          

          Here is a link to the data used.

          Command: pypy psar_test.py --datapath ./DAT_ASCII_EURUSD_M1_201705.csv

          Output:

          2017-05-01 00:30:00, data1min.close 1.089050 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089250 psar15min 1.089340
          2017-05-01 00:31:00, data1min.close 1.089120 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089210 psar15min 1.089340
          2017-05-01 00:32:00, data1min.close 1.089110 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089210 psar15min 1.089340
          2017-05-01 00:33:00, data1min.close 1.089250 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089210 psar15min 1.089340
          2017-05-01 00:34:00, data1min.close 1.089250 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089210 psar15min 1.089340
          2017-05-01 00:35:00, data1min.close 1.089250 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
          2017-05-01 00:36:00, data1min.close 1.089200 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
          2017-05-01 00:37:00, data1min.close 1.089190 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
          2017-05-01 00:38:00, data1min.close 1.089170 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
          2017-05-01 00:39:00, data1min.close 1.089160 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
          2017-05-01 00:40:00, data1min.close 1.089140 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
          2017-05-01 00:41:00, data1min.close 1.089130 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.088995 psar15min 1.089340
          2017-05-01 00:42:00, data1min.close 1.089100 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.089001 psar15min 1.089340
          2017-05-01 00:43:00, data1min.close 1.089080 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.089006 psar15min 1.089340
          2017-05-01 00:45:00, data1min.close 1.089050 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089011 psar15min 1.089340
          2017-05-01 00:46:00, data1min.close 1.089170 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089016 psar15min 1.089333
          2017-05-01 00:47:00, data1min.close 1.089150 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089021 psar15min 1.089326
          2017-05-01 00:48:00, data1min.close 1.089140 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089026 psar15min 1.089319
          2017-05-01 00:49:00, data1min.close 1.089100 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089030 psar15min 1.089313
          2017-05-01 00:50:00, data1min.close 1.089070 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089035 psar15min 1.089306
          2017-05-01 00:51:00, data1min.close 1.089060 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089039 psar15min 1.089300
          2017-05-01 00:52:00, data1min.close 1.089040 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089044 psar15min 1.089294
          2017-05-01 00:53:00, data1min.close 1.089030 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089048 psar15min 1.089288
          2017-05-01 00:54:00, data1min.close 1.089060 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089260 psar15min 1.089282
          2017-05-01 00:55:00, data1min.close 1.089040 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089256 psar15min 1.089276
          2017-05-01 00:56:00, data1min.close 1.089030 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089209 psar15min 1.089270
          2017-05-01 00:57:00, data1min.close 1.089000 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089180 psar15min 1.089265
          2017-05-01 00:58:00, data1min.close 1.089020 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089180 psar15min 1.088990
          2017-05-01 00:59:00, data1min.close 1.089060 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089180 psar15min 1.088990
          2017-05-01 01:00:00, data1min.close 1.089020 data5min.close 1.089020 data15min.close 1.089020 psar5min 1.089180 psar15min 1.088990

          I think the PSAR values should behave as follows:

          • psar5min holding the same value for every 5 intervals (5 minutes)
          • psar15min holding the same value for every 15 intervals (15 minutes)

          Thanks for this great framework!

            backtrader administrators @borodiliz last edited by backtrader

            @borodiliz said in PSAR indicator and multiple timeframes:

               #Replayer, for 1 min data
               data.replay(
                           timeframe=bt.TimeFrame.Minutes,
                           compression=1)
               cerebro.adddata(data)
            
               #Resampler for 5 minutes
               cerebro.resampledata(
                                    data,
                                    timeframe=bt.TimeFrame.Minutes,
                                    compression=5)
            
               #Resampler for 15 minutes
               cerebro.resampledata(
                                    data,
                                    timeframe=bt.TimeFrame.Minutes,
                                    compression=15)
            

            The sequence of datas introduced in the system:

            • datas[0] is the replayed version of the data (from the replay)
            • datas[1] is the 5-minute resampled data (from resampledata)
            • datas[2] is the 15-minute resampled data (from resampledata)

            Compared to your choices:

                  self.data1min = self.datas[0]  # a 1 - minute replayed version of the data feed
                  self.data5min = self.datas[1]  # 5 minute data
                  self.data15min = self.datas[2]  # 15 minute data
            

            Because the 1-minute data is replayed and the final point is constantly changing, the values calculated by the other streams are constantly changing with each tick.

            It is unclear what the input data is and why replay is being used, but let's assume it is already 1-minute data and that no replay is really needed. The logical choice for introducing the data into the system would then be (a minimal sketch follows after the lists below):

            • adddata(data) (pass timeframe and compression to GenericCSVData to identify the actual timeframe/compression as 1 minute)
            • resampledata(5 minutes)
            • resampledata(15 minutes)

            If the data is in tick format, you would then

            • resampledata(1 minute) (pass timeframe and compression to GenericCSVData to identify the actual timeframe/compression as ticks)
            • resampledata(5 minutes)
            • resampledata(15 minutes)
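            A minimal sketch of the first setup (an illustration only; the feed parameters are borrowed from the script posted above and assume the file already holds 1-minute bars):

               import backtrader as bt
               import backtrader.feeds as btfeeds

               cerebro = bt.Cerebro()

               # declare the real timeframe/compression of the file (1 minute) in the feed itself
               data = btfeeds.GenericCSVData(
                   dataname='./DAT_ASCII_EURUSD_M1_201705.csv',  # path from the script above
                   separator=';',
                   dtformat='%Y%m%d %H%M%S',
                   datetime=0, time=-1,
                   open=1, high=2, low=3, close=4, volume=5, openinterest=-1,
                   timeframe=bt.TimeFrame.Minutes,
                   compression=1)

               cerebro.adddata(data)  # datas[0]: the raw 1-minute bars

               # higher timeframes resampled from the same feed
               cerebro.resampledata(data, timeframe=bt.TimeFrame.Minutes, compression=5)   # datas[1]
               cerebro.resampledata(data, timeframe=bt.TimeFrame.Minutes, compression=15)  # datas[2]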
              backtrader administrators last edited by

              Also: it is better to use cerebro.replaydata directly than the combination of data.replay and adddata.
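              For example (a minimal sketch, assuming cerebro and data have been created as in the script above):

                 # instead of:
                 #     data.replay(timeframe=bt.TimeFrame.Minutes, compression=1)
                 #     cerebro.adddata(data)
                 # let cerebro manage the replay directly
                 cerebro.replaydata(data,
                                    timeframe=bt.TimeFrame.Minutes,
                                    compression=1)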

                borodiliz last edited by

                Thanks again for your reply @backtrader .

                The data input is 1-minute data. I've changed my code so it no longer uses replay:

                from __future__ import absolute_import
                from __future__ import division
                from __future__ import print_function
                from __future__ import unicode_literals
                import argparse
                import backtrader as bt
                import backtrader.feeds as btfeeds
                import os.path
                import sys
                
                
                
                class PsarMultipleIntervalsStrategy(bt.Strategy):
                    def log(self, txt, dt=None, tm=None):
                        ''' Logging function fot this strategy'''
                        dt = dt or self.datas[0].datetime.date(0)
                        tm = tm or self.datas[0].datetime.time(0)
                        print('%s %s, %s' % (dt.isoformat(), tm.isoformat(), txt))
                
                    def __init__(self):
                
                        self.data1min = self.datas[0]
                        self.data5min = self.datas[1]   # 5 minute resampled data
                        self.data15min = self.datas[2]  # 15 minute resampled data
                
                        # new PSAR indicator for datas[1] (5 mins)
                        self.psar5min = bt.indicators.PSAR(self.data5min)
                
                        # new PSAR indicator for datas[2] (15 mins)
                        self.psar15min = bt.indicators.PSAR(self.data15min)
                
                    def next(self):
                        self.log ("data1min.close %.6f data5min.close  %.6f data15min.close %.6f psar5min %.6f psar15min %.6f " % (self.data1min.close[0], self.data5min.close[0], self.data15min.close[0], self.psar5min.psar[0], self.psar15min.psar[0]))
                
                
                def parse_args():
                    parser = argparse.ArgumentParser(
                                                     formatter_class=argparse.ArgumentDefaultsHelpFormatter,
                                                     description='PsarMultipleIntervalsStrategy')
                
                    parser.add_argument('--datapath', '-dp',
                                        default='./data/histdata.com/EURUSD/DAT_ASCII_EURUSD_M1_2017.csv',
                                        help='Absolute path to data')
                
                    parser.add_argument('--plot', '-p', action='store_true',
                                        help='Plot the read data')
                
                    return parser.parse_args()
                
                if __name__ == '__main__':
                
                    args = parse_args()
                
                    # Create a cerebro entity
                    cerebro = bt.Cerebro()
                
                    # Add a strategy
                    cerebro.addstrategy(PsarMultipleIntervalsStrategy)
                
                    # Datas are in a subfolder of the samples. Need to find where the script is
                    # because it could have been called from anywhere
                    modpath = os.path.dirname(os.path.abspath(sys.argv[0]))
                    datapath = os.path.join(modpath, args.datapath)
                
                    ## http://www.histdata.com/f-a-q/
                    data = btfeeds.GenericCSVData(
                                                  dataname=datapath,
                                                  separator=';',
                                                  nullvalue=float('NaN'),
                                                  dtformat=('%Y%m%d %H%M%S'),
                                                  tmformat=('%H%M%S'),
                                                  datetime=0,
                                                  time=-1,
                                                  open=1,
                                                  high=2,
                                                  low=3,
                                                  close=4,
                                                  volume=5,
                                                  openinterest=-1,
                                                  timeframe=bt.TimeFrame.Minutes,
                                                  compression=1
                                                  )
                    cerebro.adddata(data)
                
                    #Resampler for 5 minutes
                    cerebro.resampledata(
                                         data,
                                         timeframe=bt.TimeFrame.Minutes,
                                         compression=5)
                
                    #Resampler for 15 minutes
                    cerebro.resampledata(
                                         data,
                                         timeframe=bt.TimeFrame.Minutes,
                                         compression=15)
                
                
                
                    # Run over everything
                    cerebro.run(preload=False)
                
                    # Plot if requested
                    if args.plot:
                        cerebro.plot(
                                     style='bar',
                                     numfigs=1,
                                     volume=True)
                
                

                But the final 5-minute and 15-minute values still seem to be recalculated:

                2017-01-02 20:30:00, data1min.close 1.047690 data5min.close 1.047690 data15min.close 1.047690 psar5min 1.047220 psar15min 1.046580
                2017-01-02 20:31:00, data1min.close 1.047680 data5min.close 1.047690 data15min.close 1.047690 psar5min 1.047220 psar15min 1.046580
                2017-01-02 20:32:00, data1min.close 1.047900 data5min.close 1.047690 data15min.close 1.047690 psar5min 1.047220 psar15min 1.046580
                2017-01-02 20:33:00, data1min.close 1.047880 data5min.close 1.047690 data15min.close 1.047690 psar5min 1.047220 psar15min 1.046580
                2017-01-02 20:34:00, data1min.close 1.047840 data5min.close 1.047690 data15min.close 1.047690 psar5min 1.047220 psar15min 1.046580
                2017-01-02 20:35:00, data1min.close 1.047940 data5min.close 1.047940 data15min.close 1.047690 psar5min 1.047220 psar15min 1.046580
                2017-01-02 20:36:00, data1min.close 1.047910 data5min.close 1.047940 data15min.close 1.047690 psar5min 1.047376 psar15min 1.046580
                2017-01-02 20:37:00, data1min.close 1.047900 data5min.close 1.047940 data15min.close 1.047690 psar5min 1.047501 psar15min 1.046580
                2017-01-02 20:38:00, data1min.close 1.047960 data5min.close 1.047940 data15min.close 1.047690 psar5min 1.047601 psar15min 1.046580
                2017-01-02 20:39:00, data1min.close 1.047980 data5min.close 1.047940 data15min.close 1.047690 psar5min 1.047620 psar15min 1.046580
                2017-01-02 20:40:00, data1min.close 1.048050 data5min.close 1.048050 data15min.close 1.047690 psar5min 1.047620 psar15min 1.046580
                2017-01-02 20:41:00, data1min.close 1.048040 data5min.close 1.048050 data15min.close 1.047690 psar5min 1.047660 psar15min 1.046580
                2017-01-02 20:42:00, data1min.close 1.048040 data5min.close 1.048050 data15min.close 1.047690 psar5min 1.047660 psar15min 1.046580
                2017-01-02 20:43:00, data1min.close 1.048040 data5min.close 1.048050 data15min.close 1.047690 psar5min 1.047660 psar15min 1.046580
                2017-01-02 20:44:00, data1min.close 1.047900 data5min.close 1.048050 data15min.close 1.047690 psar5min 1.047660 psar15min 1.046580
                2017-01-02 20:45:00, data1min.close 1.047930 data5min.close 1.047930 data15min.close 1.047930 psar5min 1.047660 psar15min 1.046580
                2017-01-02 20:46:00, data1min.close 1.047640 data5min.close 1.047930 data15min.close 1.047930 psar5min 1.047750 psar15min 1.046878
                2017-01-02 20:47:00, data1min.close 1.047530 data5min.close 1.047930 data15min.close 1.047930 psar5min 1.047821 psar15min 1.047116
                2017-01-02 20:48:00, data1min.close 1.047510 data5min.close 1.047930 data15min.close 1.047930 psar5min 1.047860 psar15min 1.047220
                2017-01-02 20:49:00, data1min.close 1.047580 data5min.close 1.047930 data15min.close 1.047930 psar5min 1.047860 psar15min 1.047220
                2017-01-02 20:50:00, data1min.close 1.047500 data5min.close 1.047500 data15min.close 1.047930 psar5min 1.048070 psar15min 1.047220
                2017-01-02 20:51:00, data1min.close 1.047510 data5min.close 1.047500 data15min.close 1.047930 psar5min 1.048060 psar15min 1.047220
                2017-01-02 20:52:00, data1min.close 1.047580 data5min.close 1.047500 data15min.close 1.047930 psar5min 1.048060 psar15min 1.047220
                2017-01-02 20:53:00, data1min.close 1.047650 data5min.close 1.047500 data15min.close 1.047930 psar5min 1.048060 psar15min 1.047220
                2017-01-02 20:54:00, data1min.close 1.047770 data5min.close 1.047500 data15min.close 1.047930 psar5min 1.048060 psar15min 1.047220
                2017-01-02 20:55:00, data1min.close 1.047830 data5min.close 1.047830 data15min.close 1.047930 psar5min 1.048060 psar15min 1.047220
                2017-01-02 20:56:00, data1min.close 1.047650 data5min.close 1.047830 data15min.close 1.047930 psar5min 1.048049 psar15min 1.047220
                2017-01-02 20:57:00, data1min.close 1.047630 data5min.close 1.047830 data15min.close 1.047930 psar5min 1.048038 psar15min 1.047220
                2017-01-02 20:58:00, data1min.close 1.047610 data5min.close 1.047830 data15min.close 1.047930 psar5min 1.048027 psar15min 1.047220
                2017-01-02 20:59:00, data1min.close 1.047550 data5min.close 1.047830 data15min.close 1.047930 psar5min 1.048017 psar15min 1.047220
                2017-01-02 21:00:00, data1min.close 1.047650 data5min.close 1.047650 data15min.close 1.047650 psar5min 1.048006 psar15min 1.047220

                I've even tried adding separate data feeds to avoid any relation between the datas, but I get the same results:

                   ## http://www.histdata.com/f-a-q/
                    data = btfeeds.GenericCSVData(
                                                  dataname=datapath,
                                                  separator=';',
                                                  nullvalue=float('NaN'),
                                                  dtformat=('%Y%m%d %H%M%S'),
                                                  tmformat=('%H%M%S'),
                                                  datetime=0,
                                                  time=-1,
                                                  open=1,
                                                  high=2,
                                                  low=3,
                                                  close=4,
                                                  volume=5,
                                                  openinterest=-1,
                                                  timeframe=bt.TimeFrame.Minutes,
                                                  compression=1
                                                  )
                
                    ## http://www.histdata.com/f-a-q/
                    data5 = btfeeds.GenericCSVData(
                                                   dataname=datapath,
                                                   separator=';',
                                                   nullvalue=float('NaN'),
                                                   dtformat=('%Y%m%d %H%M%S'),
                                                   tmformat=('%H%M%S'),
                                                   datetime=0,
                                                   time=-1,
                                                   open=1,
                                                   high=2,
                                                   low=3,
                                                   close=4,
                                                   volume=5,
                                                   openinterest=-1,
                                                   timeframe=bt.TimeFrame.Minutes,
                                                   compression=1
                                                   )
                
                    ## http://www.histdata.com/f-a-q/
                    data15 = btfeeds.GenericCSVData(
                                                    dataname=datapath,
                                                    separator=';',
                                                    nullvalue=float('NaN'),
                                                    dtformat=('%Y%m%d %H%M%S'),
                                                    tmformat=('%H%M%S'),
                                                    datetime=0,
                                                    time=-1,
                                                    open=1,
                                                    high=2,
                                                    low=3,
                                                    close=4,
                                                    volume=5,
                                                    openinterest=-1,
                                                    timeframe=bt.TimeFrame.Minutes,
                                                    compression=1
                                                    )
                    cerebro.adddata(data)
                
                    #Resampler for 5 minutes
                    cerebro.resampledata(
                                         data5,
                                         timeframe=bt.TimeFrame.Minutes,
                                         compression=5)
                
                    #Resampler for 15 minutes
                    cerebro.resampledata(
                                         data15,
                                         timeframe=bt.TimeFrame.Minutes,
                                         compression=15)
                

                Just to clarify what I'm trying to do, given the following plot:

                (plot image not available)

                I'm trying to get the last 5-minute and 15-minute PSAR points in a 1-minute strategy. Is there any way to do it with the backtrader framework? Should I do it manually by storing the latest 5-minute and 15-minute values?
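                For example, something like this (an untested sketch of what I mean by storing the values, not an official recipe):

                   import backtrader as bt

                   class FrozenPsarStrategy(bt.Strategy):
                       def __init__(self):
                           self.data1min = self.datas[0]
                           self.data5min = self.datas[1]   # 5-minute resampled data
                           self.psar5min = bt.indicators.PSAR(self.data5min)
                           self.len5 = 0                   # bars of datas[1] seen so far
                           self.psar5_frozen = float('nan')

                       def next(self):
                           if len(self.data5min) > self.len5:
                               # a new 5-minute bar has been delivered: record its PSAR value
                               # and ignore any later recalculation of the same bar
                               self.len5 = len(self.data5min)
                               self.psar5_frozen = self.psar5min.psar[0]
                           # use self.psar5_frozen for decisions on the 1-minute clock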

                Thanks in advance!

                  backtrader administrators last edited by

                  The code presented above should really work (unlike the version with the dangling replayed point). That it doesn't is a symptom of:

                  • A non-idempotent next function in the indicator

                  Pending a review, and recalling the principles on which Wilder based the indicator, its status includes, amongst many other things, a virtual position. The reason next is re-invoked during each iteration is to support data replaying, in which the current bar can change.

                  Obviously Wilder could keep such a status because he didn't have such a scenario in mind.

                  The implementation will be reviewed.

                    borodiliz last edited by borodiliz

                    Thank you @backtrader for your prompt response. I'm sure your average response time is better than that of a lot of commercial software companies!

                    After further investigation of this issue, I have reached the following conclusions:

                    • The next function of the PSAR indicator for datas[1] (5 minutes) and datas[2] (15 minutes) is called every time a new bar is generated in datas[0] (1 minute)
                    • A new PSAR value is calculated each time and, due to the PSAR acceleration, the value keeps changing
                    • This should not happen, because we are processing the same bar and not a new one, so the PSAR value should stay the same.

                    With the above in mind, and as a newbie at both backtrader and backtesting, I've solved this issue with these changes.

                    Now the PSAR values make sense:

                    2017-05-01 00:30:00, data1min.close 1.089050 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089257 psar15min 1.089340
                    2017-05-01 00:31:00, data1min.close 1.089120 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089257 psar15min 1.089340
                    2017-05-01 00:32:00, data1min.close 1.089110 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089257 psar15min 1.089340
                    2017-05-01 00:33:00, data1min.close 1.089250 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089257 psar15min 1.089340
                    2017-05-01 00:34:00, data1min.close 1.089250 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.089257 psar15min 1.089340
                    2017-05-01 00:35:00, data1min.close 1.089250 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:36:00, data1min.close 1.089200 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:37:00, data1min.close 1.089190 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:38:00, data1min.close 1.089170 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:39:00, data1min.close 1.089160 data5min.close 1.089250 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:40:00, data1min.close 1.089140 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:41:00, data1min.close 1.089130 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:42:00, data1min.close 1.089100 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:43:00, data1min.close 1.089080 data5min.close 1.089140 data15min.close 1.089050 psar5min 1.088990 psar15min 1.089340
                    2017-05-01 00:45:00, data1min.close 1.089050 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.088995 psar15min 1.089340
                    2017-05-01 00:46:00, data1min.close 1.089170 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.088995 psar15min 1.089340
                    2017-05-01 00:47:00, data1min.close 1.089150 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.088995 psar15min 1.089340
                    2017-05-01 00:48:00, data1min.close 1.089140 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.088995 psar15min 1.089340
                    2017-05-01 00:49:00, data1min.close 1.089100 data5min.close 1.089050 data15min.close 1.089050 psar5min 1.088995 psar15min 1.089340
                    2017-05-01 00:50:00, data1min.close 1.089070 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089001 psar15min 1.089340
                    2017-05-01 00:51:00, data1min.close 1.089060 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089001 psar15min 1.089340
                    2017-05-01 00:52:00, data1min.close 1.089040 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089001 psar15min 1.089340
                    2017-05-01 00:53:00, data1min.close 1.089030 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089001 psar15min 1.089340
                    2017-05-01 00:54:00, data1min.close 1.089060 data5min.close 1.089070 data15min.close 1.089050 psar5min 1.089001 psar15min 1.089340
                    2017-05-01 00:55:00, data1min.close 1.089040 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089006 psar15min 1.089340
                    2017-05-01 00:56:00, data1min.close 1.089030 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089006 psar15min 1.089340
                    2017-05-01 00:57:00, data1min.close 1.089000 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089006 psar15min 1.089340
                    2017-05-01 00:58:00, data1min.close 1.089020 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089006 psar15min 1.089340
                    2017-05-01 00:59:00, data1min.close 1.089060 data5min.close 1.089040 data15min.close 1.089050 psar5min 1.089006 psar15min 1.089340
                    2017-05-01 01:00:00, data1min.close 1.089020 data5min.close 1.089020 data15min.close 1.089020 psar5min 1.089260 psar15min 1.089333

                    My implementation is surely not the best. I think the next function of any indicator should only be called when a new bar is available for the data the indicator belongs to, not for every bar of every data.
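                    To illustrate the pattern (just a sketch of the idea, not the actual PSAR fix): an indicator can recompute its line value on every call, but only advance its carried-over state when its own length has actually changed:

                       import backtrader as bt

                       class LengthGuardedIndicator(bt.Indicator):
                           lines = ('value',)
                           params = (('period', 10),)

                           def __init__(self):
                               self.addminperiod(self.p.period)
                               self._lastlen = 0  # length for which the state was last advanced

                           def next(self):
                               # the line value itself may safely be recomputed on every call
                               datasum = sum(self.data.get(size=self.p.period))
                               self.lines.value[0] = datasum / self.p.period

                               if len(self) == self._lastlen:
                                   return  # same bar delivered again: leave the state untouched
                               self._lastlen = len(self)
                               # ... advance stateful bookkeeping (trend, acceleration, ...) here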

                    Question: does this issue affect other indicators that should only change when a "new" bar is generated?

                    Thanks again!

                      backtrader administrators last edited by

                      Thanks for the extra input. The problem had actually already been identified and a fix may hopefully even be released later today. It takes a different approach to solving the root cause.

                        backtrader administrators last edited by

                        You can try this from the development branch. Any recalculation will keep the status in a length-controlled array (with only 2 elements). A switch to the other status will only happen with a change in length (a rough sketch of the idea follows after the commit reference).

                        • Commit - 22405936eacae470c2d8d0528a4ad1b86a986787
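                        A rough sketch of the idea (an assumption based on the description above, not the committed code):

                           class _SarStatus(object):
                               # values carried over from bar to bar
                               sar = None   # current SAR value
                               tr = False   # current trend (True: up, False: down)
                               af = 0.0     # acceleration factor
                               ep = 0.0     # extreme price

                           # inside the indicator, e.g. self._status = [_SarStatus(), _SarStatus()]
                           # and in next():
                           #     prev = self._status[(len(self) - 1) % 2]  # status of the previous bar
                           #     cur = self._status[len(self) % 2]         # status being (re)built
                           # recalculating the same bar only rewrites 'cur'; 'prev' stays untouched,
                           # so a switch to the other slot only happens when len(self) changes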
                          backtrader administrators last edited by backtrader

                          And a new sample with intraday resampling (5 to 15 minutes) using PSAR has also been committed to development. To be found under:

                          • samples/psar/psar-intraday.py

                          The last part of the output shows how a change in the PSAR value for the resampled data (the 2nd set of values) only happens when the datetime changes. Of course the PSAR can remain static even if the time changes.
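                          A guess (not the actual sample code) at the kind of next() logging that produces the lines below: length, datetime, close and PSAR for the 5-minute data, followed by the same fields for the 15-minute resample:

                             import backtrader as bt

                             class PsarLogStrategy(bt.Strategy):
                                 def __init__(self):
                                     # hypothetical names: one PSAR instance per data feed
                                     self.psar0 = bt.indicators.PSAR(self.data0)
                                     self.psar1 = bt.indicators.PSAR(self.data1)

                                 def next(self):
                                     print(','.join([
                                         '%04d' % len(self),
                                         '%04d' % len(self.data0),
                                         self.data0.datetime.datetime(0).isoformat(sep=' '),
                                         '%.2f' % self.data0.close[0], 'PSAR', '%.2f' % self.psar0[0],
                                         '%04d' % len(self.data1),
                                         self.data1.datetime.datetime(0).isoformat(sep=' '),
                                         '%.2f' % self.data1.close[0], 'PSAR', '%.2f' % self.psar1[0],
                                     ]))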

                          2127,2127,2006-01-30 16:15:00,3682.94,PSAR,3675.84,0709,2006-01-30 16:15:00,3682.94,PSAR,3672.43
                          2128,2128,2006-01-30 16:20:00,3682.59,PSAR,3677.34,0709,2006-01-30 16:15:00,3682.94,PSAR,3672.43
                          2129,2129,2006-01-30 16:25:00,3682.53,PSAR,3678.98,0709,2006-01-30 16:15:00,3682.94,PSAR,3672.43
                          2130,2130,2006-01-30 16:30:00,3681.70,PSAR,3680.26,0710,2006-01-30 16:30:00,3681.70,PSAR,3672.69
                          2131,2131,2006-01-30 16:35:00,3680.63,PSAR,3680.36,0710,2006-01-30 16:30:00,3681.70,PSAR,3672.69
                          2132,2132,2006-01-30 16:40:00,3682.26,PSAR,3680.45,0710,2006-01-30 16:30:00,3681.70,PSAR,3672.69
                          2133,2133,2006-01-30 16:45:00,3683.04,PSAR,3680.45,0711,2006-01-30 16:45:00,3683.04,PSAR,3672.94
                          2134,2134,2006-01-30 16:50:00,3684.04,PSAR,3681.39,0711,2006-01-30 16:45:00,3683.04,PSAR,3672.94
                          2135,2135,2006-01-30 16:55:00,3685.65,PSAR,3682.14,0711,2006-01-30 16:45:00,3683.04,PSAR,3672.94
                          2136,2136,2006-01-30 17:00:00,3681.90,PSAR,3685.65,0712,2006-01-30 17:00:00,3681.90,PSAR,3673.18
                          2137,2137,2006-01-30 17:05:00,3679.43,PSAR,3685.65,0712,2006-01-30 17:00:00,3681.90,PSAR,3673.18
                          2138,2138,2006-01-30 17:10:00,3677.14,PSAR,3685.38,0712,2006-01-30 17:00:00,3681.90,PSAR,3673.18
                          2139,2139,2006-01-30 17:15:00,3679.97,PSAR,3683.57,0713,2006-01-30 17:15:00,3679.97,PSAR,3675.68
                          2140,2140,2006-01-30 17:20:00,3678.68,PSAR,3682.01,0713,2006-01-30 17:15:00,3679.97,PSAR,3675.68
                          2141,2141,2006-01-30 17:25:00,3679.28,PSAR,3680.82,0713,2006-01-30 17:15:00,3679.97,PSAR,3675.68
                          2142,2142,2006-01-30 17:30:00,3677.52,PSAR,3680.14,0714,2006-01-30 17:30:00,3677.52,PSAR,3685.65
                          

                          0_1496181889218_89486251-7671-4eec-a0ef-ada766038259-image.png

                            borodiliz last edited by borodiliz

                            Just for testing purposes I've modified your psar-intraday.py to accept the following parameters:

                            • --adddata1m to add the 1 minute data (no data feed is added by default)
                            • --addresample10m to add a resample for 10 mins
                            • --addresample15m to add a resample for 15 mins

                            I've executed the following commands:

                            • psar-intraday.py --adddata1m
                            • psar-intraday.py --addresample10m
                            • psar-intraday.py --addresample15m
                            • psar-intraday.py --adddata1m --addresample10m --addresample15m

                            On the following branches:

                            • master@mementum/backtrader
                            • development@mementum/backtrader
                            • PSAR_multiple_timeframes@borodiliz/backtrader

                            Please check all results here (only first 30 lines)

                            Keep in mind that I'm not sure what the correct values are even for a single data feed; I'm only checking for differences between the implementations!

                            My conclusions:

                            • PSAR calculations for a single data have changed with your new implementation (diff between master and development)
                            • PSAR calculations for a single data have NOT changed from master to borodiliz (diff between master and borodiliz)
                            • PSAR calculations for multiple data feeds seem "weird" in development@mementum/backtrader

                            Thanks again for your time!

                              backtrader administrators @borodiliz last edited by

                              @borodiliz said in PSAR indicator and multiple timeframes:

                              PSAR calculations for a single data have changed with your new implementation (diff between master and development)

                              Only the initial calculation has changed due to keeping the status in between calls unless the length has changed. Out of 512 bars from the sample these are the differences between old (-) and new (+) (using diff -u). Take into account that the period is set to 20 in the sample.

                              -  20,2005-01-28,2936.16
                              -  21,2005-01-31,2946.20
                              -  22,2005-02-01,2950.68
                              -  23,2005-02-02,2958.07
                              -  24,2005-02-03,2977.51
                              -  25,2005-02-04,2992.34
                              -  26,2005-02-07,3001.87
                              +  20,2005-01-28,2889.99
                              +  21,2005-01-31,2909.79
                              +  22,2005-02-01,2925.63
                              +  23,2005-02-02,2943.94
                              +  24,2005-02-03,2962.88
                              +  25,2005-02-04,2978.73
                              +  26,2005-02-07,2995.42
                                 27,2005-02-08,3011.58
                              -  28,2005-02-09,3029.25
                              +  28,2005-02-09,3026.46
                                 29,2005-02-10,3064.30
                                 30,2005-02-11,3035.37
                                 31,2005-02-14,3035.37
                              

                              After line 28 there is absolutely no difference in the calculation, which shows that there was a gap in the initial status setup: ep was not being put into the status objects in one of the cases. The per-bar calculation once the minimum period has been reached was unchanged and remains unchanged.

                              This commit in the development branch addresses the initial status:

                              • https://github.com/mementum/backtrader/commit/f097b7b316b9552e33a5069cee69625b32d52f1c
                                borodiliz @backtrader last edited by

                                @backtrader said in PSAR indicator and multiple timeframes:

                                This commit in the development branch addresses the initial status:

                                https://github.com/mementum/backtrader/commit/f097b7b316b9552e33a5069cee69625b32d52f1c

                                Now it works like a charm. Thanks for solving this @backtrader!

                                  borodiliz last edited by

                                  Continuing with my PSAR tests, I now have problems using TA-Lib.

                                  Given this test https://github.com/borodiliz/backtrader/tree/psar-talib/samples/psar-talib

                                  There are two CSV files with the same information but in two different timeframes:

                                  • 5-minutes
                                  • 60-minutes

                                  I've done the following tests:

                                  Testing the built-in PSAR indicator

                                  data0 => 60-minute. No resampling.

                                  psar-talib-vs-backtrader.py --adddata --data data/60m.txt --compression 60 --plot --strat talib=False
                                  0_1497019754249_psar-backtrader-data60m.png

                                  data0 => 5-minute . data1 => resample data0 to 60-minute:

                                  psar-talib-vs-backtrader.py --adddata --data data/5m.txt --compression 5 --addresample60m --plot --strat talib=False
                                  0_1497019767989_psar-backtrader-data5m-resample60m.png

                                  OK! The problem explained earlier in this thread is now solved; the 60-minute data is the same in both tests.

                                  Testing the TA-Lib PSAR indicator

                                  Now the same tests, but using TA-Lib:

                                  data0 => 60-minute. No resampling.

                                  psar-talib-vs-backtrader.py --adddata --data data/60m.txt --compression 60 --plot --strat talib=True
                                  0_1497019781302_psar-talib-data60m.png
                                  data0 => 5-minute. data1 => resample data0 to 60-minute:

                                  psar-talib-vs-backtrader.py --adddata --data data/5m.txt --compression 5 --addresample60m --plot --strat talib=True

                                  0_1497019796637_psar-talib-data5m-resample60m.png

                                  Whoops! Why are the 60-minute PSAR calculations so different when resampling?

                                  Another question: even without data resampling, the PSAR results differ between the built-in PSAR and the TA-Lib one. Any idea?

                                  Thanks again for your time

                                    backtrader administrators last edited by backtrader

                                    Without knowing exactly what the TA-LIB implementation of ParabolicSAR does, the following things could apply:

                                    • The default values selected for the parameters may be different.

                                      But that doesn't seem to be the case, because both charts show 0.02 and 0.2, which are the values for af (acceleration factor) and afmax (maximum acceleration factor). A comparison sketch with the parameters set explicitly follows after this list.

                                    • The choice as to what the price has penetrated ... means, i.e. whether it is >= or > (or the opposite operators for a downtrend)

                                      This is left open in the book by Welles Wilder, which is the only source.

                                      But even if this could account for some of the differences, it doesn't seem to account for the larger effects.
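                                    As a sanity check, both indicators could be instantiated with the parameters spelled out explicitly, so that any remaining difference cannot come from the defaults (a sketch only, inside a strategy's __init__):

                                       # same step (0.02) and maximum (0.20) for both implementations
                                       psar_bt = bt.indicators.PSAR(self.data, af=0.02, afmax=0.20)
                                       psar_ta = bt.talib.SAR(self.data.high, self.data.low,
                                                              acceleration=0.02, maximum=0.20)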

                                    Larger Effects

                                    • Without claiming that the implementation in backtrader is right (it may or may not be), looking at the non-resampled 60-minute chart and focusing on the values between 2006-01-13 and 2006-01-16, the ParabolicSAR in TA-LIB seems not to accelerate downwards following the price trend.

                                    • Actually, every downwards trend in the TA-LIB version seems to lack the parabolic acceleration and moves mostly in a linear fashion, whereas the uptrends seem very similar (if not equal) in both backtrader and TA-LIB.

                                    Resampled Version

                                    • This would actually require looking into the sources of TA-LIB, but because the version in backtrader had to keep a status in between calls, the implementors of TA-LIB may have faced the same situation and the status kept in between calls may be affecting the results (again: this is just an assumption, with no source code having been seen at all).
                                      borodiliz last edited by borodiliz

                                      Thanks for your reply @backtrader

                                      Focusing just on TA-Lib and on why the 60-minute resampled data is different from the 60-minute non-resampled data, I've done some tests:

                                      psar-talib-vs-backtrader.py --adddata --data data/60m.txt --compression 60 --strat talib=True --cerebro runonce=True --plot

                                      0_1497269004754_psar_talib_runonce_true.png

                                      psar-talib-vs-backtrader.py --adddata --data data/60m.txt --compression 60 --strat talib=True --cerebro runonce=False --plot

                                      0_1497268936993_psar_talib_runonce_false.png

                                      And it makes sense to me, because as I understand it:

                                      • If runonce=True, all the data is passed to TA-Lib (using talib.py:once()), so TA-Lib keeps the status for the parabolic acceleration
                                      • If runonce=False, backtrader only passes 2 values to TA-Lib (using talib.py:next()), so TA-Lib can't keep the status
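                                      For completeness, the sample's --cerebro runonce=... switch corresponds to this when using the API directly (illustration only):

                                         cerebro = bt.Cerebro(runonce=False)  # force the bar-by-bar next() path
                                         # or equivalently at run time:
                                         # cerebro.run(runonce=False)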

                                      So, my question:

                                      Is there any way to "force" the number of values (i.e. the narrays array in talib.py:next()) that backtrader passes to TA-Lib? I've tried the following without success:

                                      self.psar0 = bt.talib.SAR(self.datas[0].high, self.datas[0].low, timeperiod=30)
                                      

                                      My idea was to set timeperiod to maximum acceleration / acceleration = 0.2 / 0.02 = 10 periods (the defaults).

                                        borodiliz last edited by

                                        For those interested, I've found a temporary solution:

                                                        self.psar0 = bt.talib.SAR(self.datas[0].high, self.datas[0].low)
                                                        self.psar0._lookback = None
                                        

                                        As I understand it, by setting _lookback to None on a TA-Lib indicator we force backtrader to pass len(self) values to TA-Lib, i.e. "as much data as possible".

                                        I'm not sure whether this is the right solution or what performance issues could arise when working with very large data feeds... but it seems to work. Now the 60-minute TA-Lib ParabolicSAR indicator looks OK in all scenarios (resampled data and runonce=False).

                                          backtrader administrators last edited by

                                          _lookback=0 should also work. This is automatically done if the indicator is marked as unstable in TA-LIB.

                                          The inclusion of TA-LIB is completely automated based on the meta-information available in each TA-LIB function (aka indicator).

                                          In this case it seems that the SAR in TA-LIB is not marked and backtrader uses the available minimum period information.

                                          The alternative explanation would be that backtrader is not parsing the information properly, but the EMA, which is marked as unstable by TA-LIB, works properly with both runonce=True and runonce=False.

                                          A specific exception would be needed to handle the SAR case, marking it as unstable (which it actually is) and disregarding the actual meta-information from TA-LIB.
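                                          Until then, the workaround can be applied by hand in the strategy, in the same spirit as the _lookback = None snippet above:

                                             self.psar0 = bt.talib.SAR(self.datas[0].high, self.datas[0].low)
                                             self.psar0._lookback = 0  # treat SAR as an unstable-period indicator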

                                            backtrader administrators last edited by

                                            An exception has been added to the automated TA-LIB integration to mark the SAR indicator/function as unstable, which will automatically set _lookback=0.

                                            Already in the development branch
