Issue with using fromdate, todate and csv data
-
I'm having trouble getting any output when using fromdate and todate to select less than 11 months of data from a CSV.
My test script below prints nothing at all unless I widen the window to 11 months or more.
My test strategy:
```
from __future__ import (absolute_import, division, print_function,
                        unicode_literals)

import datetime  # For datetime objects
import os.path  # To manage paths
import sys  # To find out the script name (in argv[0])

# Import the backtrader platform
import backtrader as bt


# Create a Strategy
class TestStrategy(bt.Strategy):
    def next(self):
        print(self.data.close[0])


if __name__ == '__main__':
    # Setup Cerebro
    cerebro = bt.Cerebro(tradehistory=True, oldbuysell=True)
    cerebro.broker.setcash(10000.0)
    cerebro.broker.setcommission(commission=0)

    modpath = os.path.dirname(os.path.abspath(sys.argv[0]))
    datafile = os.path.join(modpath, 'data/gbpusd-1m.csv')

    # Add a strategy
    cerebro.addstrategy(TestStrategy)

    # Create a Data Feed
    data = bt.feeds.GenericCSVData(
        timeframe=bt.TimeFrame.Minutes,
        compression=1,
        dataname=datafile,
        fromdate=datetime.datetime(2002, 1, 1),
        todate=datetime.datetime(2002, 6, 1),
        dtformat='%Y-%m-%d %H:%M:%S',
        datetime=0, high=2, low=3, open=1, close=4,
        volume=5, openinterest=-1)

    # Add the Data Feed to Cerebro
    cerebro.adddata(data)

    # Run strategy
    result = cerebro.run(runonce=False)
```
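To confirm whether the feed is delivering any bars at all (rather than the strategy failing to print), a quick check is to count bars with backtrader's standard `stop()` hook. A minimal sketch extending the strategy above:

```
# Sketch: count the bars the feed actually delivers. stop() is a
# standard bt.Strategy hook that runs once when the backtest ends.
class TestStrategy(bt.Strategy):
    def next(self):
        print(self.data.close[0])

    def stop(self):
        # len(self.data) is the number of bars processed; if the
        # fromdate/todate window matches no rows, this prints 0
        print('bars loaded:', len(self.data))
```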
Here is a sample of my data:
```
date,open,high,low,close,volume
2002-10-21 01:00:00,1.55035,1.55035,1.5503,1.5503,0
2002-10-21 01:01:00,1.55035,1.55035,1.5503,1.5503,0
2002-10-21 01:02:00,1.55035,1.55035,1.5503,1.5503,0
2002-10-21 01:03:00,1.55035,1.55035,1.5503,1.5503,0
2002-10-21 01:04:00,1.5502,1.5502,1.55015,1.55015,0
2002-10-21 01:05:00,1.54975,1.54995,1.54965,1.54985,0
2002-10-21 01:06:00,1.54995,1.55005,1.54985,1.55,0
2002-10-21 01:07:00,1.54995,1.54995,1.5497,1.5497,0
2002-10-21 01:08:00,1.54955,1.5496,1.54945,1.54945,0
2002-10-21 01:09:00,1.54935,1.54945,1.54935,1.54945,0
2002-10-21 01:10:00,1.54945,1.54945,1.54935,1.54945,0
2002-10-21 01:11:00,1.5495,1.54955,1.54935,1.54955,0
2002-10-21 01:12:00,1.5495,1.5495,1.5495,1.5495,0
2002-10-21 01:13:00,1.54955,1.54955,1.5495,1.5495,0
2002-10-21 01:14:00,1.54955,1.54955,1.54955,1.54955,0
2002-10-21 01:15:00,1.54955,1.54955,1.54955,1.54955,0
2002-10-21 01:16:00,1.5495,1.54955,1.5495,1.5495,0
2002-10-21 01:17:00,1.5494,1.54945,1.5494,1.5494,0
2002-10-21 01:18:00,1.5494,1.54945,1.5494,1.5494,0
2002-10-21 01:19:00,1.5493,1.5493,1.5492,1.5492,0
2002-10-21 01:20:00,1.5494,1.5495,1.5494,1.5494,0
2002-10-21 01:21:00,1.5493,1.5494,1.5493,1.5493,0
2002-10-21 01:22:00,1.54945,1.54945,1.54885,1.5490000000000002,0
2002-10-21 01:23:00,1.54885,1.5489,1.54875,1.5488,0
2002-10-21 01:24:00,1.54895,1.5490000000000002,1.54885,1.5489,0
2002-10-21 01:25:00,1.5488,1.5488,1.5488,1.5488,0
2002-10-21 01:26:00,1.5489,1.5489,1.5487,1.54875,0
2002-10-21 01:27:00,1.5487,1.54875,1.5487,1.54875,0
2002-10-21 01:28:00,1.5486,1.54865,1.54835,1.5484,0
2002-10-21 01:29:00,1.5485,1.54855,1.5484,1.5484,0
2002-10-21 01:30:00,1.5483,1.54835,1.54805,1.54805,0
2002-10-21 01:31:00,1.54805,1.5484,1.54805,1.5483,0
2002-10-21 01:32:00,1.5483,1.54835,1.5483,1.5483,0
2002-10-21 01:33:00,1.5482,1.54825,1.54815,1.5482,0
2002-10-21 01:34:00,1.54825,1.54825,1.54815,1.54825,0
2002-10-21 01:35:00,1.54815,1.5482,1.5481,1.54815,0
2002-10-21 01:36:00,1.54795,1.548,1.54775,1.54785,0
```
Thanks!
-
It would be very difficult to get any output with the two things shown above combined.
@Eddy-Bennett said in Issue with using fromdate, todate and csv data:
```
fromdate=datetime.datetime(2002, 1, 1),
todate=datetime.datetime(2002, 6, 1),
```
This filters the feed to January through the 1st of June.
And the data
@Eddy-Bennett said in Issue with using fromdate, todate and csv data:
```
date,open,high,low,close,volume
2002-10-21 01:00:00,1.55035,1.55035,1.5503,1.5503,0
2002-10-21 01:01:00,1.55035,1.55035,1.5503,1.5503,0
```
starts on the 21st of October, months after the todate cutoff. No row in the file falls inside the window, so nothing is delivered to the strategy.
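A window that actually overlaps the file should produce output. A minimal sketch, replacing the feed construction in the script above and assuming the data really does start at 2002-10-21 as the sample suggests (adjust to whatever range the file covers):

```
# Sketch: a date window that overlaps the data shown above
data = bt.feeds.GenericCSVData(
    timeframe=bt.TimeFrame.Minutes,
    compression=1,
    dataname=datafile,  # same path as in the script above
    fromdate=datetime.datetime(2002, 10, 21),
    todate=datetime.datetime(2002, 11, 1),
    dtformat='%Y-%m-%d %H:%M:%S',
    datetime=0, open=1, high=2, low=3, close=4,
    volume=5, openinterest=-1)
```

The strategy will then print a close for every minute bar inside the window.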
-
Ah well, now I feel stupid, lol. Thanks for pointing that out. I thought I had bought the entire year's worth of data, but it would appear not!
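For anyone else who hits this, a quick look at the date range the file actually covers would have caught it. A minimal sketch using only the standard library (assuming the same header-plus-rows layout as the sample above):

```
import csv

# Sketch: print the first and last timestamps in the file to see
# what date range it actually covers
with open('data/gbpusd-1m.csv') as f:
    reader = csv.reader(f)
    next(reader)                 # skip the header row
    first = next(reader)[0]      # timestamp of the first data row
    last = first
    for row in reader:
        last = row[0]            # keep overwriting until the last row

print('first bar:', first)
print('last bar: ', last)
```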