I saw some old posts on this topic and was wondering if there is anything new. I have a 32 GB Windows 10 machine with 8 cores. The data itself is about 20 MB in a CSV. My machine completed about 400 optimization runs before Python crashed with a memory error. My question is: what are some strategies to prevent a memory error? In theory I don't need anything more than a few statistics after each run; should I use a for loop instead?
cerebro = bt.Cerebro(stdstats=True,optreturn=False)
Does turning off stdstats usually help?
vbs replied:
One point is to make sure you are using the 64-bit version of Python.
I haven't seen this page before; hopefully it will help.
stdstats=False disables the standard observers, which are mostly meant for easy visual inspection in a simple plot. Plotting in optimization scenarios doesn't really make sense (though one user wanted it).
It is going to save some memory. Your other option is to use exactbars (Docs - Cerebro).
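For reference, a minimal sketch pulling these options together with the question's Cerebro call (parameter names are from the backtrader docs; the strategy and parameter grid are placeholders):

```python
import backtrader as bt

cerebro = bt.Cerebro(
    stdstats=False,   # drop the default observers (Broker, Trades, BuySell)
    optreturn=True,   # return lightweight results instead of full strategy instances
    optdatas=True,    # preload the data once and reuse it across runs
    exactbars=True,   # keep only the minimum look-back buffer per line
)
# cerebro.optstrategy(MyStrategy, period=range(10, 31))  # placeholder strategy/grid
# results = cerebro.run(maxcpus=8)
```

Note that exactbars=True disables plotting and full-length line buffers, which is exactly why it saves memory; it is a reasonable trade-off during optimization runs.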
But the underlying problem has a name: Python. The threading model has the GIL (Global Interpreter Lock; let's not enter into the esoterics of why it is there), which effectively makes CPU-bound multithreading in Python useless. IO-bound multithreading can still be used.
The consequence, because you are in a CPU-bound scenario, is that the only way to profit from the cores is to use multiprocessing, which effectively means copying data back and forth between processes; this takes time and adds to memory usage.
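The multiprocessing trade-off can be seen with the standard library alone; the worker function here is just a stand-in for one backtest run, not backtrader's actual internals:

```python
from multiprocessing import Pool

def run_backtest(params):
    """Stand-in for one optimization run: the params tuple is pickled and
    copied to a worker process, CPU-bound work happens there, and only a
    small result tuple is pickled and copied back."""
    fast, slow = params
    # pretend this loop is the expensive simulation
    score = sum(i * fast - slow for i in range(1000))
    return (fast, slow, score)

if __name__ == "__main__":
    grid = [(f, s) for f in range(5, 8) for s in range(20, 22)]
    with Pool(processes=4) as pool:
        # the copying in both directions is the cost the post describes;
        # keeping the returned objects small keeps memory usage flat
        results = pool.map(run_backtest, grid)
    print(len(results))  # one small tuple per run, not a full strategy object
```

This is also why optreturn=True helps: it shrinks what each worker has to send back to the parent process.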
Furthermore, the only way to fully avoid running out of RAM would be to write the results to storage and discard everything.
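A common pattern for that, sketched here with the standard csv module (the per-run statistics and the file name are placeholders, not something backtrader produces by itself):

```python
import csv

def save_run_stats(path, rows):
    """Append one small summary row per optimization run, then discard it."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for row in rows:
            writer.writerow(row)

# e.g. after each batch of runs, keep only the few numbers you need:
stats = [(10, 0.042, 1.31), (11, 0.038, 1.27)]  # (period, return, sharpe) -- placeholder values
save_run_stats("opt_results.csv", stats)
```

With the rows on disk, the result objects themselves can be dropped, so memory no longer grows with the number of runs.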
Thanks a lot. Simply doing this:
cerebro = bt.Cerebro(stdstats=False,optreturn=True,optdatas=True)
solved all of my problems. Backtrader is a powerful tool. I was able to complete 800 rounds in 2.5 hours.