marketstore NEW data feed complete code & example
-
MarketStore is a database server optimized for financial timeseries data (see the MarketStore project to read more). I am integrating it because it also allows real-time streaming of bars from the database via a websocket, which IMO gives consistency between real-time and historical testing. Data is stored at tick resolution and then, using on-disk aggregate updates, downsampled into any timeframe you want. The easiest way to get this working is to build MarketStore from source in Docker; it is already integrated with GDAX and works out of the box.
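As a quick illustration of querying those on-disk aggregates with pymarketstore (a sketch only; the date range is an example and the available timeframes depend on the aggregate triggers you configure):

# Sketch: pull hourly OHLCV bars that MarketStore has aggregated on disk.
import pymarketstore as pymkts

client = pymkts.Client('http://127.0.0.1:5993/rpc')

# The downsampling to '1Hour' happens inside MarketStore, not in the client.
params = pymkts.Params('ETH', '1Hour', 'OHLCV',
                       start='2018-06-01', end='2018-06-03')

bars = client.query(params).first().array
for bar in bars[:5]:
    # each record is (epoch, open, high, low, close, volume)
    print(bar)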
The next step for me is to integrate polygon.io, which is real-time trade-by-trade SIP data for $200 a month; latency is under 10ms if you are colocated in or around NY4. I will update this thread with the polygon connector to MarketStore once it is complete and post the live datafeed code as well. I will update the code below too once all the rest is done.
thanks,
datafeed:
import datetime

import pytz
import pymarketstore as pymkts

import backtrader as bt
import backtrader.feed as feed
from ..utils import date2num


class MarketStore(feed.DataBase):
    params = (
        ('dataname', None),
        ('fromdate', datetime.date(1990, 1, 1)),
        ('todate', datetime.date(2050, 1, 1)),
        ('name', ''),
        ('compression', 1),
        ('timeframe', bt.TimeFrame.Days),
        ('host', '127.0.0.1'),
        ('port', '5993'),
        ('symbol', None),
        ('query_timeframe', None),
    )

    def start(self):
        super(MarketStore, self).start()

        # Connect to the MarketStore JSON-RPC endpoint
        self.ndb = pymkts.Client('http://{host}:{port}/rpc'.format(
            host=self.p.host, port=self.p.port))

        # Query OHLCV bars for the requested symbol/timeframe/date range
        qstr = pymkts.Params(self.p.symbol, self.p.query_timeframe, 'OHLCV',
                             start=self.p.fromdate.isoformat(),
                             end=self.p.todate.isoformat())

        dbars = list(self.ndb.query(qstr).first().array)
        self.biter = iter(dbars)

    def _load(self):
        try:
            bar = next(self.biter)
        except StopIteration:
            return False

        # Each record is (epoch, open, high, low, close, volume)
        self.l.datetime[0] = date2num(
            datetime.datetime.fromtimestamp(bar[0], pytz.utc))
        self.l.open[0] = bar[1]
        self.l.high[0] = bar[2]
        self.l.low[0] = bar[3]
        self.l.close[0] = bar[4]
        self.l.volume[0] = bar[5]

        return True
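One note on the imports: from ..utils import date2num only resolves if the file sits inside the backtrader package (next to the other feeds), which is why the test script below refers to it as bt.feeds.MarketStore. If you prefer to keep it as a standalone module, something like the following sketch should work (the module name marketstore_feed.py is just an example, not part of the original code):

# Standalone usage sketch: save the class above as marketstore_feed.py and
# change its relative import to:  from backtrader.utils import date2num
import datetime

import pytz
import backtrader as bt

from marketstore_feed import MarketStore  # hypothetical module name

cerebro = bt.Cerebro()
data = MarketStore(symbol='ETH', name='ETH', query_timeframe='1Hour',
                   fromdate=datetime.date(2018, 6, 3),
                   todate=datetime.date(2018, 6, 3),
                   tz=pytz.timezone('US/Eastern'))
cerebro.adddata(data)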
simple test strategy:
from __future__ import (absolute_import, division, print_function,
                        unicode_literals)

import datetime

import pytz

import backtrader as bt


class Master(bt.Strategy):
    def __init__(self):
        self.dataopen = self.datas[0].open
        self.datahigh = self.datas[0].high
        self.datalow = self.datas[0].low
        self.dataclose = self.datas[0].close

    def next(self):
        # self.log('Line')
        print(self.datas[0].datetime.date(0),
              self.datas[0].datetime.time(0),
              self.dataopen[0],
              self.datahigh[0],
              self.datalow[0],
              self.dataclose[0],
              self.datas[0]._name)


def runstrat():
    cerebro = bt.Cerebro()
    cerebro.broker.setcash(550000.0)

    data123 = bt.feeds.MarketStore(
        symbol='ETH',
        name='ETH',
        query_timeframe='1Hour',
        timeframe=bt.TimeFrame.Minutes,
        fromdate=datetime.date(2018, 6, 3),
        todate=datetime.date(2018, 6, 3),
        # sessionstart=datetime.time(7),
        # sessionend=datetime.time(10),
        tz=pytz.timezone('US/Eastern'),
    )

    cerebro.adddata(data123)
    cerebro.addstrategy(Master)
    cerebro.run()
    print('finished')


if __name__ == '__main__':
    runstrat()
-
Thanks for sharing
-
@blonc said in marketstore NEW data feed complete code & example:
The next step for me is to integrate polygon.io, which is real-time trade-by-trade SIP data for $200 a month; latency is under 10ms if you are colocated in or around NY4.
What additional latency are you seeing on the GDAX connection (compared to a direct GDAX API connection) when streaming via the MarketStore websocket connector? E.g. by the time a bar comes down from the GDAX API, is written to disk in MarketStore, and is re-streamed out, how far behind the real-time market data are you?
Do you think you could run this with a real-time L1 or L2 data feed?
-
@blonc said in marketstore NEW data feed complete code & example:
I am integrating it because it also allows real-time streaming of bars from the database via a websocket, which IMO gives consistency between real-time and historical testing. Data is stored at tick resolution and then, using on-disk aggregate updates, downsampled into any timeframe you want. The easiest way to get this working is to build MarketStore from source in Docker; it is already integrated with GDAX and works out of the box.

It looks like you were originally trying to have MarketStore feed both historical and live data into backtrader. I have MarketStore running with the GDAX feeder, but the feeder only uses the historical data API to fetch data even when the timeframe is 1Min, which means it is always a little behind real-time. Likewise, the code above also uses the polling API to retrieve data from MarketStore.
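For reference, the polling path boils down to something like this sketch (the limit / limit_from_start keywords are an assumption about the pymarketstore version in use; if they are not available, querying from a recent start timestamp does the same job):

# Sketch: poll MarketStore for the newest 1Min bar instead of subscribing
# to the websocket stream -- this is why the data lags real-time slightly.
import time

import pymarketstore as pymkts

client = pymkts.Client('http://127.0.0.1:5993/rpc')

while True:
    params = pymkts.Params('ETH', '1Min', 'OHLCV',
                           limit=1, limit_from_start=False)
    latest = client.query(params).first().array[-1]
    print(latest)   # (epoch, open, high, low, close, volume)
    time.sleep(5)   # poll interval; the bar is still behind the live feed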
Did you ever get it working with data feeding in over the websocket, and with a websocket connection from backtrader to MarketStore?
-
Were you ever able to integrate polygon.io? @blonc