Overcoming memory limitations in optimization
A strategy I'm testing has 15 variables, which is proving difficult for my machine to handle (8 cores, 16 GB RAM). Even using just half a month of hourly data causes my IDE and Python to crash from lack of memory. This is with bt.Cerebro(stdstats=False), and regardless of what I set optdatas to, it still crashes.
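For reference, this is roughly how the optimization is configured (a sketch only; the optreturn flag and the maxcpus value are not settings I've listed above, just other memory-related switches I'm aware of):

```python
import backtrader as bt

cerebro = bt.Cerebro(
    stdstats=False,   # drop the default observers to save memory
    optdatas=True,    # preload/precalculate the data once and share it across runs
    optreturn=True,   # return lightweight results instead of full strategy objects
)

# data feed and cerebro.optstrategy(...) are added as usual, then:
# results = cerebro.run(maxcpus=2)  # fewer workers -> fewer simultaneous copies in memory
```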
Is there a way to save the data to the hard drive and then run the optimization loops against that cached copy? A feature like that could help cut down the preload required for future runs on the same data.
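Something along these lines is what I have in mind (a hypothetical sketch; the file names and the use of pandas pickle files are just for illustration):

```python
import os
import pandas as pd
import backtrader as bt

CACHE = 'hourly_data.pkl'   # assumed file names, not an existing feature
SOURCE = 'hourly_data.csv'  # raw OHLCV data with a datetime index

if os.path.exists(CACHE):
    df = pd.read_pickle(CACHE)  # fast path: reuse the pre-parsed copy
else:
    df = pd.read_csv(SOURCE, index_col=0, parse_dates=True)
    df.to_pickle(CACHE)         # cache the parsed data for future runs

data = bt.feeds.PandasData(dataname=df)
cerebro = bt.Cerebro(stdstats=False)
cerebro.adddata(data)
```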
Reply from hghhgghdf dfdf:
In Cerebro's run method you can intercept the itertools.product of the strategy parameter combinations and choose to run only a limited number of them:
```python
# inside backtrader's Cerebro.run(), replacing `iterstrats = itertools.product(*self.strats)`
lststrats = list(itertools.product(*self.strats))
random.shuffle(lststrats)      # randomize the order of the parameter combinations
lststrats = lststrats[:100]    # keep only 100 of them for this session
iterstrats = iter(lststrats)
```
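Shuffling before slicing means the 100 combinations you do run are a random sample of the full grid rather than the first 100 in product order, so each session explores a different slice of the parameter space. Make sure random is imported at the top of cerebro.py (itertools already is).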
Using 5 years of hourly data I can run roughly 80-160 optimization runs at a time, depending on how much memory the observers and indicators take up.
Alternatively, you could plan your tests around that number: optimize in batches of 3 parameters with 6-8 values for each parameter.
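A hypothetical sketch of that batching idea (the strategy and its parameter names are made up, standing in for your 15 real variables):

```python
import backtrader as bt

class MyStrategy(bt.Strategy):
    # made-up parameters; the rest of the strategy's variables stay at their defaults
    params = dict(fast=10, slow=50, stop=0.02)

    def __init__(self):
        self.fast_ma = bt.ind.SMA(period=self.p.fast)
        self.slow_ma = bt.ind.SMA(period=self.p.slow)

cerebro = bt.Cerebro(stdstats=False, optdatas=True, optreturn=True)
# cerebro.adddata(...)  # the (cached) data feed goes here

# Batch 1: optimize three parameters with 6-8 values each,
# leaving every other parameter fixed at its default
cerebro.optstrategy(
    MyStrategy,
    fast=range(5, 13),                            # 8 values
    slow=range(30, 90, 10),                       # 6 values
    stop=[0.01, 0.02, 0.03, 0.04, 0.05, 0.06],    # 6 values
)
# results = cerebro.run(maxcpus=2)
```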