Float division by zero
I am looking for help with solving zero-division errors on my data feed.
File ".../linebuffer.py", line 745, in next
self = self.operation(self.a, self.b)
ZeroDivisionError: float division by zero
- a generic CSV data feed. Fields: datetimestamp, price, volume
- data = bt.feeds.GenericCSVData(
      fromdate=datetime.datetime(2016, 1, 1),
      todate=datetime.datetime(2016, 3, 1),
      high=1, low=1, open=1, close=1,
- cerebro.resampledata(data, timeframe=bt.TimeFrame.Ticks, compression=2)
The OHLC fields all point to the single 'price' column (as read here: https://www.backtrader.com/blog/posts/2016-04-14-bidask-data-to-ohlc/bidask-data-to-ohlc.html ). The CSV content is clean, that is: no missing data, no zero values. The resampling was added because the normal cerebro.adddata(data) crashed almost immediately (I thought maybe compression would solve the issue).
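For context, resampling with timeframe=bt.TimeFrame.Ticks and compression=2 folds every two ticks into one OHLC bar. A standalone plain-Python sketch of that folding (the function name is made up for illustration; this is not backtrader's actual resampler):

```python
def ticks_to_bars(prices, compression=2):
    # Fold every `compression` consecutive tick prices into one OHLC bar,
    # mimicking what resampledata(..., compression=2) conceptually does.
    bars = []
    for i in range(0, len(prices) - len(prices) % compression, compression):
        chunk = prices[i:i + compression]
        bars.append({"open": chunk[0], "high": max(chunk),
                     "low": min(chunk), "close": chunk[-1]})
    return bars

bars = ticks_to_bars([10.0, 11.0, 10.5, 10.2])
assert bars[0] == {"open": 10.0, "high": 11.0, "low": 10.0, "close": 11.0}
assert bars[1] == {"open": 10.5, "high": 10.5, "low": 10.2, "close": 10.2}
```

Note that with a feed like the one above, where open, high, low and close all map to the same 'price' column, each incoming "tick" already carries four identical values before resampling.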
The code above runs successfully for a while, but eventually crashes when using a larger dataset (i.e. increasing the time frame). For some reason the CSV reading seems to crash. I took a close look at the data itself, but cannot find any peculiarities. I also started looking into the backtrader code, but, being new to backtrader, could not really get a grip on the code in linebuffer.py where the error occurs. I therefore figured that asking someone with deeper knowledge of backtrader might save me hours of code digging.
Hence my question: any idea what causes the zero division errors and how to prevent them?
If the code is crashing with:
File ".../linebuffer.py", line 745, in next
    self = self.operation(self.a, self.b)
ZeroDivisionError: float division by zero
then you are not simply loading the data feed and resampling it. You are operating on the data with indicators or some line operations, i.e. doing something like:
self.myvalue = self.data.high / self.data.low
(the possibility of a division is hinted at by the / above, and that division is the operation that ends up being executed in linebuffer.py)
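The failure mode is easy to reproduce outside backtrader: any float division whose denominator is 0.0 raises exactly this exception, and guarding against it means choosing a substitute value. A minimal standalone sketch (the helper name is made up here; backtrader exposes the same idea through the safediv/safezero parameters discussed later in this thread):

```python
def safe_div(a, b, zero=0.0):
    # Return a / b, substituting `zero` when b == 0 instead of raising.
    # Standalone illustration of the idea behind safediv/safezero;
    # not backtrader's actual code.
    try:
        return a / b
    except ZeroDivisionError:
        return zero

assert safe_div(9.0, 3.0) == 3.0   # normal division unaffected
assert safe_div(1.0, 0.0) == 0.0   # guarded: no crash
```

On a feed where high and low come from the same 'price' column, high - low is always exactly 0.0, so any indicator dividing by that range is guaranteed to hit this case.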
I trimmed the problem down to this line:
The indicator does not protect itself against zero divisions. Just adding this single line (without any added arithmetic based on it) causes processing to crash. I intend to dig into the code of this strategy, hoping I can find a way to elegantly circumvent these kinds of crashes...
Setting safediv=True will protect against zero divisions. Beats me why this is not the default behaviour (there are no comments on it in the code).
Suggestion: make safe processing the default behaviour.
Because it consumes memory and processing time, and only 0.1% of the time does someone have data which runs into such conditions.
But the provision is there for cases in which the price formation is, so to speak, "ill-formed".
Thanks, glad to notice it was easy to solve.
Hopefully this post will point anyone running into this issue in the right direction. Personally I am of the opinion that code should always be as robust and 'fool proof' as possible. Especially when there is little chance of an error occurring, figuring out what is going wrong and how to solve it is likely to waste many precious coding hours. Hence my suggestion to make 'safe processing' the default, and optimizations optional.
Thanks for pointing me in the right direction!
It's not fool proof. Using safediv offers you an option to still use the Stochastic on data it is probably not meant for.
The safediv parameter plays together with safezero, which has a default value of 0.0. This is a choice and can be changed by you, because once you enter the territory of indetermination, assigning a value to an operation which hasn't got a value means making a compromise.
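To make that compromise concrete, here is a standalone sketch of a Stochastic-style %K computation with the two parameters (an illustration of the behaviour described above, not backtrader's actual implementation):

```python
def percent_k(close, high, low, safediv=False, safezero=0.0):
    # %K = 100 * (close - low) / (high - low); when high == low the
    # denominator is zero and the value is mathematically undetermined.
    den = high - low
    if den == 0.0:
        if not safediv:
            raise ZeroDivisionError("float division by zero")
        return safezero  # the compromise: substitute a chosen value
    return 100.0 * (close - low) / den

assert percent_k(5.0, 6.0, 4.0) == 50.0                              # normal bar
assert percent_k(5.0, 5.0, 5.0, safediv=True) == 0.0                 # default safezero
assert percent_k(5.0, 5.0, 5.0, safediv=True, safezero=50.0) == 50.0 # custom choice
```

Choosing safezero=50.0 here, for instance, would treat a zero-range bar as "mid-range" rather than "oversold"; which substitute makes sense depends entirely on how the rest of the strategy consumes the value.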