I’d like to give you a preview of the data that will be available from the automated reporting tool. This is the evolution of the tool that @Ben has been writing with support from the NuBot developers. The complexity of the operations that some of the custodians (myself included) run required the original reporting tool to be rebuilt from the ground up to account for multiple exchanges, multiple markets, and other edge cases (for example, accounts that are used for liquidity balancing, but not for liquidity operations).
This tool was designed so a custodian can collect and process operations data (trades, open orders, non-order balances, wall shifts, configurations, etc.) from their running instances of NuBot and then confidentially publish that data to the custodian’s Github repository.
When the back-end was re-written, the original HTML content became outdated, so it had to be refreshed as well. That work is being completed this week, and I’ll make a formal announcement when it’s ready, but I wanted to make the raw and summary data sets available now.
When available, trade data extends back to the launch of the Nu network.
Every 10 minutes an updated report is generated and published to a Github repository on my account.
To preview the raw data (separated by exchange and market), or the summary data (summary.json), please visit: https://github.com/KiaraTamm/kiaratamm.github.io/tree/master/data
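To give a feel for what the publishing cycle produces, here is a minimal Python sketch of a report writer that lays out raw data per exchange and market alongside a top-level summary.json, mirroring the structure of the linked repository. The function name, field names, and file names are illustrative assumptions, not Raportisto’s actual code; in the real tool this runs on a timer and is followed by a git commit and push.

```python
# Hypothetical sketch of the report-publishing step described above.
# Directory layout (exchange/market subfolders plus summary.json) is
# modeled on the published data repository; names are assumptions.
import json
from pathlib import Path

def write_report(data_dir, raw_by_market, summary):
    """Write per-exchange/market raw trade files plus summary.json."""
    root = Path(data_dir)
    for (exchange, market), records in raw_by_market.items():
        market_dir = root / exchange / market
        market_dir.mkdir(parents=True, exist_ok=True)
        (market_dir / "trades.json").write_text(json.dumps(records, indent=2))
    # The summary rolls the raw data up into a single top-level file.
    (root / "summary.json").write_text(json.dumps(summary, indent=2))
```

A scheduler (cron or a simple loop) would call this every 10 minutes and then push the updated files to the Github repository.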
Please hold your questions about the specific market data until the HTML pages have been released.
Update: I’m working with Kiara to troubleshoot why her reporting script stopped sending updates to Github ~7 hours ago. It appears to be related to a Linux dependency update. I hope to have it resolved ASAP.
Update 2: Resolved. Also, she’s going to reduce the frequency of publishing to every 30 minutes.
Feb. 05 Update:
I’ve been working on a couple of different things over the past few days:
Lack of Recent Publishing
Late yesterday afternoon, @KTm made me aware of an issue she had discovered. Her reporting scripts were still running, but information wasn’t making its way to Github. When I investigated, it turned out that the Excoin outage was indirectly responsible. There is an oversight in the current set of reporting tools that puts the report processing into a fatal logic loop if an exchange’s API isn’t accessible. This is a straightforward fix, so I’ll make the change and push out a new version to the custodians as soon as I get a chance to work on it today.
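The shape of the fix can be sketched as follows: bound the retries and skip an unreachable exchange rather than looping on it forever, so one outage (like Excoin’s) doesn’t stall the whole report. The function and parameter names here are illustrative assumptions, not the actual Raportisto code.

```python
# Hypothetical sketch of the outage fix described above: an exchange
# whose API is down is retried a bounded number of times, then skipped,
# so report processing can continue for the remaining exchanges.
def collect_all(exchanges, fetch, max_attempts=3):
    """Collect trade data per exchange, tolerating API outages."""
    results, failed = {}, []
    for name in exchanges:
        for _attempt in range(max_attempts):
            try:
                results[name] = fetch(name)
                break
            except OSError:          # connection refused, timeout, etc.
                continue
        else:
            failed.append(name)      # give up; report continues without it
    return results, failed
```

The key design point is that failure is recorded and reported rather than retried indefinitely, which is what trapped the original tool in its fatal loop.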
My apologies on missing something that, in retrospect, is very obvious.
Reporting Visual Interface
The front end for the reporting has temporarily been reduced in priority while I work on the other items listed. My goal is to get back to it ASAP, but if anyone in the community with a background in front-end development would be interested in working with me to accelerate getting this portion out, please let me know. I’m not the fastest developer, and am learning as I go along – getting faster every day, but not nearly fast enough.
Enhanced Historical Logging
By design, NuBot creates a new subdirectory in the logs directory where it runs for each session. Previously we had been relying on the discrete output in orders_history.csv to pull together the information that appears in the custodian’s summary.json report (in addition to the trade data returned from the different exchanges’ APIs).
NuBot activity logs are now being parsed in aggregate across sessions (a new session begins any time the bot operator starts, stops, or restarts the bot). This change has not been pushed live and is still undergoing testing. So far the test results are looking very positive, and I’m optimistic that I’ll be able to migrate these changes to the Raportisto repository in the next few days.
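The cross-session aggregation can be sketched roughly like this: walk every session subdirectory under the logs directory, read each session’s orders_history.csv, and merge the rows into one time-ordered history. The column name (“timestamp”) and directory layout here are assumptions for illustration; NuBot’s actual CSV columns may differ.

```python
# Hypothetical sketch of aggregating per-session NuBot logs. Assumes each
# session writes its own subdirectory containing orders_history.csv with
# a "timestamp" column; actual column names may differ.
import csv
from pathlib import Path

def aggregate_sessions(logs_dir):
    """Merge orders_history.csv rows from every session, ordered by time."""
    rows = []
    for session_csv in Path(logs_dir).glob("*/orders_history.csv"):
        with open(session_csv, newline="") as fh:
            rows.extend(csv.DictReader(fh))
    rows.sort(key=lambda r: r["timestamp"])
    return rows
```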
With this added information, we’ll be able to provide a historic view of all* of the orders placed and the prices they were placed at, along with the reported price of NBT in whatever market is being supported. With this data we should be able to build a picture of how liquidity has changed over time and identify time periods when the bot was more (or less) susceptible to potential value losses or gains.
* ‘All’ meaning all of the data that was retained, which likely covers every session; there will probably still be a few gaps in the data set due to external issues like bot server downtime.