How can one compute the log returns of a series, and the sign of each bar, using an indicator class?

I have tried the following code, without much success:

```
import math
import backtrader as bt

class RET(bt.Indicator):
    '''
    Compute Log Returns
    '''
    # Indicator Variables
    plotlines = dict(RET=dict(_method='bar', alpha=0.50, width=1.0))
    lines = ('RET', 'SIGN')

    def __init__(self):
        # Store Values to Lines
        sign = abs(self.data.close - self.data.open) / (self.data.close - self.data.open)
        self.lines.SIGN = sign
        self.lines.RET = math.log(self.data.close) - math.log(self.data.close[-1])
```

Depending on which line is active, this fails with one of the following errors:

```
<ipython-input-47-c83fca8591c6> in __init__(self)
12 #sign = (self.data.close - self.data.open) / (self.data.close - self.data.open)
13 #self.lines.SIGN = sign
---> 14 self.lines.RET = math.log(self.data.close) - math.log(self.data.close[-1])
TypeError: a float is required
```

or

```
<ipython-input-50-cf0b74bdd9aa> in __init__(self)
10 def __init__(self):
11 # Store Values to Lines
---> 12 sign = (self.data.close - self.data.open) / (self.data.close - self.data.open)
13 self.lines.SIGN = sign
14 #self.lines.RET = math.log(self.data.close) - math.log(self.data.close[-1])
TypeError: unsupported operand type(s) for /: 'LinesOperation' and 'LinesOperation'
```

I suppose these errors stem from the way my code is accessing the data. Any hints on how to perform these calculations?

Thank you!