@tianjixuetu
Thanks. Your analysis is really helpful to me.
Latest posts made by Dastaan Sharma
-
RE: Extending Pandas Dataframe with string/datetime column values.
-
RE: Extending Pandas Dataframe with string/datetime column values.
Ah, alright.
It is better to apply date2num.
It solved my problem, though.
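(For later readers: a minimal sketch of that conversion, assuming the expiry column from my question below. backtrader lines only hold floats, and date2num, which backtrader exposes as bt.date2num in the versions I have used, produces its internal float datetime representation.)

import backtrader as bt
import pandas as pd

# Hypothetical frame with a datetime 'expiry' column, mirroring the question below
df = pd.DataFrame(
    {'expiry': pd.to_datetime(['2015-04-30', '2015-04-30'])},
    index=pd.to_datetime(['2015-04-01 09:15:00', '2015-04-01 09:30:00']),
)

# Lines can only carry floats, so convert the datetimes to backtrader's
# internal float representation before handing the frame to the feed
df['expiry'] = df['expiry'].map(bt.date2num)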
-
Extending Pandas Dataframe with string/datetime column values.
Hi BT community,
I need help with extending the Pandas data feed. My dataframe looks like this:
                         open      high       low     close  volume      expiry
time_stamp
2015-04-01 09:15:00  18250.00  18315.00  18226.15  18295.00  222650  2015-04-30
2015-04-01 09:30:00  18295.00  18299.50  18261.55  18278.00   66875  2015-04-30
2015-04-01 09:45:00  18275.20  18358.95  18266.95  18345.05  162800  2015-04-30
2015-04-01 10:00:00  18345.05  18390.00  18332.60  18374.95  146525  2015-04-30
2015-04-01 10:15:00  18375.00  18411.65  18361.35  18383.20  119875  2015-04-30
I have tried to figure this out by myself but would love some help.
class custom(bt.feeds.PandasData):
    lines = ('open', 'high', 'low', 'close', 'volume', 'expiry',)
    params = (
        ('open', -1),
        ('high', -1),
        ('low', -1),
        ('close', -1),
        ('volume', -1),
        ('expiry', -1),
    )
How can I use string/datetime values as lines?
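Edit: applying date2num (see my reply above) eventually solved this. In case it helps someone else, a hedged sketch of how the extra line can then be consumed, assuming the expiry column has already been converted to floats with date2num. The strategy name here is made up for illustration, and bt.num2date turns the stored float back into a datetime:

import backtrader as bt

class ExpiryAware(bt.Strategy):
    # Hypothetical strategy, only to show reading the extra 'expiry' line
    def next(self):
        # The line stores backtrader's float datetime representation;
        # num2date converts it back into a datetime object
        expiry = bt.num2date(self.data.expiry[0])
        print(self.data.datetime.datetime(0), 'expiry:', expiry.date())

cerebro = bt.Cerebro()
cerebro.adddata(custom(dataname=df))   # 'custom' feed class from above, df already converted
cerebro.addstrategy(ExpiryAware)
cerebro.run()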
-
RE: Backtrader Multi-processing Issue.
@backtrader
Thanks for your help, but I found out that even with 1 CPU the memory usage keeps increasing.
I think it keeps accumulating data at each and every step of the simulation; a quick way to watch this is sketched below.
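A small, hedged way to watch that growth per run, assuming psutil is installed; Cerebro.optcallback registers a function that, as far as I understand, is called in the main process once per completed optimisation run:

import os
import psutil               # third-party, assumed installed just for this check
import backtrader as bt

def report_memory(strat):
    # Print the resident memory of the main process after each finished run,
    # so the growth can be watched run by run
    rss_mb = psutil.Process(os.getpid()).memory_info().rss / 1e6
    print('run done, main process RSS: %.1f MB' % rss_mb)

cerebro = bt.Cerebro(optreturn=True, maxcpus=1)   # 1 CPU, as in this test
cerebro.optcallback(report_memory)
# cerebro.adddata(...) and cerebro.optstrategy(...) as in my original script
-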
RE: Backtrader Multi-processing Issue.
@backtrader
Thank you for helping me out with this.
The actual problem is memory usage, as you said.
The memory usage keeps rising during multiprocessing.
Although this is not an issue with Backtrader itself, I would like to know your suggestion on how to deal with it. From my point of view there could be 2 solutions:
1st) Increase the swap space.
2nd) Re-run the optimisation program after every n simulations (200 in this case); see the sketch below.
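A hedged sketch of the 2nd option: split one of the parameter ranges into chunks and run a fresh Cerebro per chunk, so each batch can be garbage collected before the next one starts. The strategy and parameter names are taken from my post below; the stub class and chunk size here are placeholders:

import backtrader as bt

class testStrategy(bt.Strategy):
    # Placeholder with the same parameter names as the real strategy;
    # the actual logic from my script goes here
    params = dict(fast=8, slow=10, dcperiod=10, trperiod=12, volumep=5)

    def next(self):
        pass

def run_chunk(slow_values):
    # Fresh Cerebro (and worker pool) per chunk; its memory can be released
    # once the call returns
    cerebro = bt.Cerebro(optreturn=True, maxcpus=2)
    # cerebro.adddata(...) as in my original script
    cerebro.optstrategy(
        testStrategy,
        fast=8,
        slow=slow_values,            # only a slice of the full range per batch
        dcperiod=range(10, 31),
        trperiod=12,
        volumep=range(5, 20),
    )
    return cerebro.run()

full_slow = list(range(9, 21))
for start in range(0, len(full_slow), 4):     # arbitrary batch size of 4
    run_chunk(full_slow[start:start + 4])
-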
Backtrader Multi-processing Issue.
Hi BT community,
I am a huge fan of BT and have been using the platform for the last 6 months. Recently I tried strategy optimisation using multiprocessing, and it worked fine for a smaller range of backtests.
It works fine for around 100 runs, but after that it stops working.

Starting Backtest
Starting optimisation
Killed
(base) aadhunik@aadhunik:~/Desktop$ Process ForkPoolWorker-3:
Traceback (most recent call last):
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/pool.py", line 127, in worker
    put((job, i, result))
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 397, in _send_bytes
    self._send(header)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/pool.py", line 132, in worker
    put((job, i, (False, wrapped)))
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
Process ForkPoolWorker-2:
Traceback (most recent call last):
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/pool.py", line 127, in worker
    put((job, i, result))
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 397, in _send_bytes
    self._send(header)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/pool.py", line 132, in worker
    put((job, i, (False, wrapped)))
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
Process ForkPoolWorker-1:
Traceback (most recent call last):
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/pool.py", line 127, in worker
    put((job, i, result))
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 397, in _send_bytes
    self._send(header)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/pool.py", line 132, in worker
    put((job, i, (False, wrapped)))
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/home/aadhunik/anaconda3/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
I have a 12-core i7 processor. Initially I thought this could be resolved by lowering the CPU count, but it doesn't work even for:
cerebro = Cerebro(optreturn=False, maxcpus=2)
cerebro.optstrategy(
    testStrategy,
    fast=8,
    slow=range(9, 21),
    dcperiod=range(10, 31),
    trperiod=12,
    volumep=range(5, 20),
)
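For scale, the ranges above already multiply into a few thousand runs, and with optreturn=False each run sends a full strategy instance back to the main process, which is where I suspect the memory goes. A quick count (my own arithmetic, assuming the ranges shown):

# slow: 12 values, dcperiod: 21 values, volumep: 15 values; fast and trperiod are fixed
runs = len(range(9, 21)) * len(range(10, 31)) * len(range(5, 20))
print(runs)   # 3780 backtests in a single optimisation pass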