
Getting error running Optimization on data from IB



  • Hi,
    While trying to use optimization (optstrategy) on data (live or historical) fetched from Interactive Brokers, I am getting this error:

    Traceback (most recent call last):
      File "D:/Users/R/BackTestLiveIBv2.py", line 219, in <module>
        runstrategy()
      File "D:/Users/R/BackTestLiveIBv2.py", line 216, in runstrategy
        cerebro.run()
      File "C:\Users\R\Anaconda3\lib\site-packages\backtrader\cerebro.py", line 1143, in run
        for r in pool.imap(self, iterstrats):
      File "C:\Users\R\Anaconda3\lib\multiprocessing\pool.py", line 735, in next
        raise value
      File "C:\Users\R\Anaconda3\lib\multiprocessing\pool.py", line 424, in _handle_tasks
        put(task)
      File "C:\Users\R\Anaconda3\lib\multiprocessing\connection.py", line 206, in send
        self._send_bytes(_ForkingPickler.dumps(obj))
      File "C:\Users\R\Anaconda3\lib\multiprocessing\reduction.py", line 51, in dumps
        cls(buf, protocol).dump(obj)
    TypeError: can't pickle _thread.lock objects
    

    Interestingly, with addstrategy it runs fine and gives this output:

    Server Version: 76
    TWS Time at connection:20200907 00:12:41 India Standard Time
    Datetime, Open, High, Low, Close, Volume, OpenInterest, SMA
    ***** DATA NOTIF: DELAYED
    0001, 2019-09-06 23:59:59.999989, 72.0, 72.0, 71.5, 71.75, 0.0, 0.0, nan
    0002, 2019-09-09 23:59:59.999989, 71.75, 71.75, 71.25, 71.75, 0.0, 0.0, nan
    0003, 2019-09-11 23:59:59.999989, 71.75, 72.0, 71.5, 71.625, 0.0, 0.0, nan
    0004, 2019-09-12 23:59:59.999989, 71.5, 71.5, 71.0, 71.125, 0.0, 0.0, nan
    0005, 2019-09-13 23:59:59.999989, 71.0, 71.25, 70.75, 70.875, 0.0, 0.0, 71.425
    0006, 2019-09-16 23:59:59.999989, 71.75, 71.75, 71.25, 71.625, 0.0, 0.0, 71.4
    0007, 2019-09-17 23:59:59.999989, 71.75, 72.0, 71.5, 71.75, 0.0, 0.0, 71.4
    .....
    .....
    0238, 2020-09-01 23:59:59.999989, 73.25, 73.25, 72.75, 72.875, 0.0, 0.0, 73.6
    0239, 2020-09-02 23:59:59.999989, 73.0, 73.25, 72.75, 73.0, 0.0, 0.0, 73.35
    0240, 2020-09-03 23:59:59.999989, 73.25, 73.75, 73.0, 73.5, 0.0, 0.0, 73.275
    0241, 2020-09-04 23:59:59.999989, 73.5, 73.5, 73.0, 73.125, 0.0, 0.0, 73.225
    ***** DATA NOTIF: DISCONNECTED
    
    

    I need help to resolve this.
    Thanks.

    Here is the code:

    import argparse
    import datetime
    import backtrader as bt
    from backtrader.utils import flushfile  
    import os
    
    class TestStrategy(bt.Strategy):
    
        params = dict(
            smaperiod=5,
            trade=False,
            stake=10,
            exectype=bt.Order.Market,
            stopafter=0,
            valid=None,
            cancel=0,
            donotsell=False,
            optim=False,
            optimParams=(0, 0),
        )
    
        def __init__(self):
            # To control operation entries
            self.orderid = list()
            self.order = None
    
            self.counttostop = 0
            self.datastatus = 0
    
            # Create SMA on 2nd data
            if self.p.optim:  # Use a tuple during optimization
                self.p.smaperiod, self.p.stake = self.p.optimParams
            self.sma = bt.indicators.MovAv.SMA(self.data, period=self.p.smaperiod)
    
    
        def notify_data(self, data, status, *args, **kwargs):
            print('*' * 5, 'DATA NOTIF:', data._getstatusname(status), *args)
            if status == data.LIVE:
                self.counttostop = self.p.stopafter
                self.datastatus = 1
    
        def notify_order(self, order):
            if order.status in [order.Completed, order.Cancelled, order.Rejected]:
                self.order = None
    
            print('-' * 50, 'ORDER BEGIN', datetime.datetime.now())
            # print(order)
            print('-' * 50, 'ORDER END')
    
        def notify_trade(self, trade):
            print('-' * 50, 'TRADE BEGIN', datetime.datetime.now())
            print(trade)
            print('-' * 50, 'TRADE END')
    
        def prenext(self):
            self.next(frompre=True)
    
        def next(self, frompre=False):
    
            txt = list()
            txt.append('%04d' % len(self))
            dtfmt = '%Y-%m-%d %H:%M:%S.%f'
            txt.append('%s' % self.data.datetime.datetime(0).strftime(dtfmt))
            txt.append('{}'.format(self.data.open[0]))
            txt.append('{}'.format(self.data.high[0]))
            txt.append('{}'.format(self.data.low[0]))
            txt.append('{}'.format(self.data.close[0]))
            txt.append('{}'.format(self.data.volume[0]))
            txt.append('{}'.format(self.data.openinterest[0]))
            txt.append('{}'.format(self.sma[0]))
            print(', '.join(txt))
    
    
            if self.counttostop:  # stop after x live lines
                self.counttostop -= 1
                if not self.counttostop:
                    self.env.runstop()
                    return
    
            if not self.p.trade:
                return
    
            if self.datastatus and not self.position and len(self.orderid) < 1:
                self.order = self.buy(size=self.p.stake,
                                      exectype=self.p.exectype,
                                      price=round(self.data0.close[0] * 0.90, 2),
                                      valid=self.p.valid)
    
                self.orderid.append(self.order)
            elif self.position.size > 0 and not self.p.donotsell:
                if self.order is None:
                    self.order = self.sell(size=self.p.stake // 2,
                                           exectype=bt.Order.Market,
                                           price=self.data0.close[0])
    
            elif self.order is not None and self.p.cancel:
                if self.datastatus > self.p.cancel:
                    self.cancel(self.order)
    
            if self.datastatus:
                self.datastatus += 1
    
        def start(self):
    
            header = ['Datetime', 'Open', 'High', 'Low', 'Close', 'Volume', 'OpenInterest', 'SMA']
            print(', '.join(header))
    
            self.done = False
    
    
    def runstrategy():
    
        cerebro = bt.Cerebro(stdstats=False)
    
        storekwargs = dict(host='127.0.0.1',
                           port=7497,
                           clientId=0,
                           notifyall=False,
                           _debug=False,
                           reconnect=3,
                           timeout=3,
                           timeoffset=False,
                           timerefresh=60.0,
                           )
    
        store = bt.stores.IBStore(**storekwargs)
    
        datakwargs = dict(
            timeframe=bt.TimeFrame.Seconds,
            compression=1,
            historical=True,
            rtbar=False,  # real time bars
            qcheck=0.5,
            backfill_start=True,
            backfill=True,
            latethrough=True,
            tz='GMT',
        )
    
        data = store.getdata(dataname='USDINR-IND-NSE-INR', **datakwargs)
        cerebro.resampledata(dataname=data)
    
        # Add the strategy
        # cerebro.addstrategy(TestStrategy,
        #                     smaperiod = 5,
        #                     trade = True)
    
        cerebro.optstrategy(TestStrategy,
                            optim=True,
                            optimParams=[[5, 10], [10, 10]],
                            trade=True)
    
    
        cerebro.run()
    
    if __name__ == '__main__':
        runstrategy()
    


  • Short:
    I'm not sure the IBData data feed is supported for optimization with multiple CPUs (it may work on a single CPU though).

    Detail:

    During an optimization with multiple CPUs involved, Backtrader uses a process pool to run each parameter permutation in parallel.
    For this to work, the Cerebro instance needs to be copied to each worker process (this is not done if only a single CPU is used).
    The copy is made with the pickle mechanism, which serializes the Cerebro instance together with all the objects connected to it (data feeds, analyzers, ...).
    The problem with the IBData feed is that it holds objects that cannot be serialized (e.g. locks, sockets, ...), which is exactly what the "can't pickle _thread.lock objects" error reports.
    For optimization one usually relies on data feeds that support preloading of the data bars without keeping any sync/connection objects alive (e.g. CSV or database based data feeds), as sketched below.
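
    The limitation can be reproduced outside backtrader with a few lines of plain Python. This is only an illustration of the mechanism (the class name is made up): any object holding a thread lock refuses to be pickled, which is what the process pool attempts when it ships the Cerebro instance to a worker.

    import pickle
    import threading

    class FeedLikeObject:
        """Stands in for a live data feed keeping a lock for thread synchronisation."""
        def __init__(self):
            self._lock = threading.Lock()

    try:
        pickle.dumps(FeedLikeObject())
    except TypeError as e:
        # Python 3.7 reports "can't pickle _thread.lock objects";
        # newer versions word it as "cannot pickle '_thread.lock' object"
        print(e)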

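    If you still want to run optstrategy against the IB feed, you can try forcing a single process so that Cerebro never has to be pickled. This is only a sketch of the changed call; maxcpus is a standard argument of cerebro.run(), but I have not verified the rest of the setup against a live connection.

    # in runstrategy(), instead of the plain cerebro.run()
    cerebro.run(maxcpus=1)  # single process: Cerebro is not serialized to workers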

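    For multi-CPU optimization the usual route is to export the IB bars to a file once and then optimize against a file-based feed, which preloads cleanly and pickles without problems. A minimal sketch, assuming the bars were previously written to usdinr-daily.csv in the default GenericCSVData column layout (the file name, the datetime format and the column layout are assumptions, not something taken from your setup).

    import backtrader as bt

    cerebro = bt.Cerebro(stdstats=False)

    data = bt.feeds.GenericCSVData(
        dataname='usdinr-daily.csv',    # assumed export of the IB bars
        dtformat='%Y-%m-%d %H:%M:%S',   # assumed datetime format in the file
        timeframe=bt.TimeFrame.Days,
        openinterest=-1,                # no open interest column in the file
    )
    cerebro.adddata(data)

    # same TestStrategy as in the question, unchanged optimization parameters
    cerebro.optstrategy(TestStrategy,
                        optim=True,
                        optimParams=[[5, 10], [10, 10]],
                        trade=True)

    cerebro.run()  # multiprocessing works because the feed is picklable

    Note that TestStrategy as written only trades after a LIVE data notification, so with a file-based feed the self.datastatus check would also have to be relaxed if orders should be placed during the backtest.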