In any case, it seems that "calling a function at a given time", be it related to the session start/end times or totally unrelated (for example, around midnight), is something people actually do. It sounds like a proper add-on.
Anaconda and WinPython are two of the most commonly used, almost-all-in-one Python distributions.
virtualenv can also be used with them to keep separate installations of the same packages (even installing directly from the git repository and pointing to a specific branch, as done by @RandyT).
Although not the core of the discussion: self.position evaluates to True only for an open position on self.data0. For other data feeds in the system, self.getposition(specific_data) can be used.
With regards to IB positions:
Positions carry no information per se.
Orders, on the other hand, have some fields that are intended to convey information, and those fields persist in the order
When creating an order via buy / sell / close, the remaining **kwargs will be used to fill fields in the order. This allows:
Overwriting values set by backtrader, for example to create an OCO order
Setting fields not set by the platform (like those conveying persistent information)
Trades are seen as a set of orders that have opened/increased a position and then reduced/closed it. The orders in a trade will still carry the persistent information.
There isn't a get_trades API call (if there is, it has been completely missed), because a Trade is a logical artifact and not an actual broker-side entity
With all that in mind:
Given an open position, there is no direct way to find out who created the position.
Given a list of orders (open/closed) with persistent information and setting a start date one could try to:
Recreate the trades and understand who opened what and when
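The kwargs pass-through mentioned above (extra **kwargs to buy / sell filling fields in the order) can be sketched with a minimal, self-contained mimic. FakeOrder and this buy() are illustrative stand-ins, not backtrader's actual classes:

```python
# Minimal mimic of how extra **kwargs handed to buy()/sell() can end up
# stored on the resulting order. Illustration only -- not backtrader code.
class FakeOrder:
    def __init__(self, size, **kwargs):
        self.size = size
        self.info = dict(kwargs)  # persists for the lifetime of the order

def buy(size=1, **kwargs):
    # backtrader's Strategy.buy forwards the remaining kwargs to the order;
    # here they simply land in FakeOrder.info
    return FakeOrder(size, **kwargs)

order = buy(size=2, entry_reason='breakout-2017-01')
print(order.info['entry_reason'])  # -> breakout-2017-01
```

With such persistent fields in place, a later scan of the order list can recreate the trades.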
@skytrading54 is also looking into having orders retrieved: specifically the open ones, if the system exits while an order is open; obviously to understand whether the next trigger has to fire something or be skipped because an open order is already awaiting execution, cancellation or expiry.
These are obviously things to look into.
Okay, looks like I can't read instructions :( ... I tried setting the timezone in the ibtest.py script to PST and voilà! All the problems are solved. All the datetime stamps are aligned and the last daily bar is waiting for the day to complete, I hope!
python ibtest.py --port 4003 --data0 ES-201703-GLOBEX --resample --timeframe Days --compression 1 --fromdate 2017-01-01 --timezone PST
Server Version: 76
TWS Time at connection:20170116 10:34:35 UTC
Timezone from ContractDetails: America/Belize
Datetime, Open, High, Low, Close, Volume, OpenInterest, SMA
***** STORE NOTIF: <error id=-1, errorCode=2104, errorMsg=Market data farm connection is OK:usfuture>
***** STORE NOTIF: <error id=-1, errorCode=2106, errorMsg=HMDS data farm connection is OK:ilhmds>
***** DATA NOTIF: DELAYED
Data0, 0001, 736332.0, 2017-01-03T00:00:00.000000, 2240.75, 2259.5, 2239.5, 2252.625, 1786475.0, 0, nan
Data0, 0002, 736333.0, 2017-01-04T00:00:00.000000, 2252.75, 2267.25, 2251.0, 2264.25, 1383745.0, 0, nan
Data0, 0003, 736334.0, 2017-01-05T00:00:00.000000, 2264.5, 2266.0, 2254.0, 2264.25, 1307080.0, 0, nan
Data0, 0004, 736335.0, 2017-01-06T00:00:00.000000, 2264.25, 2277.0, 2258.25, 2271.5, 1541103.0, 0, nan
Data0, 0005, 736338.0, 2017-01-09T00:00:00.000000, 2271.25, 2275.25, 2263.5, 2265.0, 1019553.0, 0, 2263.525
Data0, 0006, 736339.0, 2017-01-10T00:00:00.000000, 2264.5, 2274.0, 2259.5, 2263.75, 1299832.0, 0, 2265.75
Data0, 0007, 736340.0, 2017-01-11T00:00:00.000000, 2263.75, 2271.75, 2255.0, 2270.625, 1727931.0, 0, 2267.025
Data0, 0008, 736341.0, 2017-01-12T00:00:00.000000, 2270.5, 2270.5, 2248.5, 2263.375, 1745819.0, 0, 2266.85
Data0, 0009, 736342.0, 2017-01-13T00:00:00.000000, 2264.5, 2273.5, 2262.75, 2272.5, 1182909.0, 0, 2267.05
***** DATA NOTIF: LIVE
PS: I fixed it in my modified script and it works there as well. Thanks Daniel for your help and for backtrader!!
One of the latest additions lets you make time comparisons directly against lines to generate signals. Probably nobody has used it yet. An example:
lines = ('mysignal',)
self.lines.mysignal = self.data.datetime > datetime.time(10, 30)
This generates an indicator which can be used as a signal to only execute orders once the time of day goes past 10:30.
But since you are probably looking for the day, you'll need to work it out in the next method, comparing self.data.datetime.date() (maybe even getting the weekday) against something of your choosing.
Hopefully solved from the Python side, not the backtrader library. See the answer on SO about one more stub class.
Overall, abstract classes are a bit of a pain for me in Python. Maybe someday I'll master metaclasses :)
It was not meant to work that way, but there are 2 options:
You pass the created store as an argument to your strategy
IBData specific: it has an attribute named ib, which is the store. You can then do, for example: self.data0.ib.xxxxx
The idea of capturing your own ticks is sound in theory, but it has always had practical problems. That's why companies that collect and curate data make money out of it.
Collecting ticks, minute bars or anything else with backtrader is just a Python statement away (see for example the trace from a run with 2 instruments, collecting 1-minute bars during approximately 35 minutes)
The data points could have been stored somewhere (hierarchically or not) instead of simply having printed them out.
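The storing part really is a one-liner per bar. A minimal sketch, with the writer shown standalone (in a Strategy the values would be read from self.data inside next(); the function name and file path are illustrative):

```python
import csv

def store_bar(path, dt, o, h, l, c, v):
    # Append one OHLCV bar to a CSV file; called once per completed bar,
    # e.g. from a Strategy's next() with values taken from self.data.
    with open(path, 'a', newline='') as f:
        csv.writer(f).writerow([dt, o, h, l, c, v])

# one bar from the trace above, stored instead of printed
store_bar('bars.csv', '2017-01-13T00:00:00', 2264.5, 2273.5, 2262.75, 2272.5, 1182909.0)
```

Hierarchical storage (one file per instrument/day, a database, etc.) is a straightforward extension of the same idea.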
An extra fix has been pushed to the development branch, because @randyt also checked it out and found a typo in the part which services historical data downloads between 2 dates (the part initially tested downloads as much as possible and not between two given dates, which may happen during a disconnection/reconnection cycle)
Nothing prevents you from querying the Sizer which has been added to the strategy.
See the getsizer method of the strategy
Since you have subclassed Sizer (or SizerBase, which is the same), you control any extra interfaces/methods which may help you in the querying.
With that in mind, the Sizer concept was developed with the other direction in mind and that's why inside the Sizer there is a strategy member attribute, which allows the sizer to query whatever interfaces the strategy may have. The rationale behind that: abstracting the Strategy from any logic related to position sizing, allowing it to concentrate on the logic of entering/exiting the market.
The Sizer attributes are described here: https://www.backtrader.com/docu/sizers/sizers.html#backtrader.Sizer
Through the strategy attribute you can also reach the broker, for example to query the actual net liquidation value:
value = self.strategy.broker.get_value()
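The sizer -> strategy -> broker chain described above can be illustrated with minimal stand-ins (these are plain classes wired up by hand, not backtrader's; in backtrader the strategy attribute is set on the Sizer automatically):

```python
# Illustrative stand-ins for the chain Sizer -> strategy -> broker.
class Broker:
    def get_value(self):
        return 100000.0  # made-up net liquidation value

class Strategy:
    def __init__(self, broker):
        self.broker = broker

class PercentSizer:
    # a real backtrader Sizer subclass gets self.strategy for free;
    # here it is passed in explicitly for the sketch
    def __init__(self, strategy, perc=10):
        self.strategy = strategy
        self.perc = perc

    def getsizing(self, price):
        value = self.strategy.broker.get_value()
        return int(value * self.perc / 100 / price)

sizer = PercentSizer(Strategy(Broker()))
print(sizer.getsizing(price=2500.0))  # -> 4
```

The same chain works in the other direction too: the strategy can query the sizer via getsizer().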
That would mean that there are actually no transactions happening during those seconds. Two possibilities come to mind:
You have joined the data feed during a not so active market phase
You are testing against the demo account setup of TWS
Take into account that the data feed provided by IB for Forex is not like the data feed for equities, for example. Appearances of the BID price are recorded as the indication of price oscillation, which is what others do too.
In any case you may also run the sample with --debug, and you will see all messages delivered by TWS to your client (literally all of them), which may help you follow the details of price formation in Forex
You may change BID to ASK or MID (check the IB documentation for possible values) by manipulating the what parameter during the creation of an IBData data feed in backtrader.
Read here: Data feeds reference
If your number 1 goal is:
Learn Python so I can make use of Python to code my strategy
you probably shouldn't be using btrun, which is meant to abstract things away and won't help with learning.
Non-code reports are difficult to look into. In your case the last lines give an insight:
File "c:\users\charl\anaconda3\lib\site-packages\backtrader\feeds\btcsv.py", line 44, in _loadline
y = int(dttxt[0:4])
ValueError: invalid literal for int() with base 10: '14/0'
You are loading a CSV file with the BacktraderCSVData data feed, and the field being encountered is 14/0, which cannot be parsed.
If that CSV file is OK, your best options are:
Use GenericCSVData (see here) and configure the fields according to reality
Load it with pandas.read_csv and then use a PandasData (see here) data feed, also configuring the fields
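The traceback itself hints at what goes wrong: the loader expects a YYYY-MM-DD style date and tries int(dttxt[0:4]), but a day-first date like 14/01/2017 yields '14/0' for those 4 characters. A matching format string (which is what GenericCSVData's dtformat parameter lets you configure) parses it fine. The sample date is made up for illustration:

```python
import datetime

dttxt = '14/01/2017'  # day-first date; int(dttxt[0:4]) would try int('14/0') and fail
dt = datetime.datetime.strptime(dttxt, '%d/%m/%Y')
print(dt.date())  # -> 2017-01-14
```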
But if you are learning, I would recommend focusing on easy and tested samples (and the Quickstart), rather than feeding in your own data sources.