
Sharing quantitative analyses on Crypto Lake data


Analysis sharing

Sharing crypto-related quantitative analyses on our historical market data, which is distinctive in also including order book data. This repository is curated and reviewed by Lefty, an experienced quant and former Head of Research at an equity prop-trading firm, who is currently at the top of the Hummingbot leaderboard.

1. Hummingbot analysis

In the first notebook, we analyze the Hummingbot market-making strategy on a combination of trade and level_1 order book data. A very simple simulation lets us experiment with the profitability of variants of this strategy on historical data. It is intended as a basis for your own analysis, not as a direct basis for trading.
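The core of such a simulation can be sketched in a few lines: quote symmetrically around the mid and count a fill whenever a trade crosses the quote. This is a toy illustration, not the notebook's actual logic; the column names `price` and `mid` and the fixed half-spread are assumptions.

```python
import pandas as pd

def simulate_mm(trades: pd.DataFrame, half_spread: float) -> float:
    """Toy market-making fill simulation.

    Quotes symmetrically around the level-1 mid and registers a fill
    whenever a trade prints through our quote. Assumes columns 'price'
    (trade price) and 'mid' (concurrent mid quote) - illustrative names.
    """
    pnl, inventory = 0.0, 0
    for price, mid in zip(trades["price"], trades["mid"]):
        bid = mid * (1 - half_spread)
        ask = mid * (1 + half_spread)
        if price <= bid:        # aggressive seller hits our bid -> we buy
            inventory += 1
            pnl -= bid
        elif price >= ask:      # aggressive buyer lifts our ask -> we sell
            inventory -= 1
            pnl += ask
    # mark the remaining inventory at the last observed mid
    return pnl + inventory * trades["mid"].iloc[-1]
```

A real version would at least add fees, latency and queue position, but even this level of simulation lets you compare spread settings on historical data instead of live-tuning the bot.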

We estimate that this data-driven backtesting approach can prevent trading losses of >=10% of your open-order nominal per month, even with a fairly simple analysis, compared to the usual 'run the bot, check results after some time, tune parameters, repeat' approach. It also saves a lot of time. Note that the effect can be much bigger on volatile, low-volume coins.

-> Open the notebook in nbviewer

2. Exploratory analysis

This notebook looks at MM profitability, spread characteristics, order-book imbalance and a few autocorrelations on crypto order book data. There are some interesting results on extreme order-book imbalances and short-term return autocorrelations.
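For reference, level-1 order-book imbalance is commonly defined as the normalized difference of best bid and best ask sizes, giving a value in [-1, 1] where +1 means all displayed size sits on the bid. A minimal sketch, assuming hypothetical column names `bid_size` and `ask_size`:

```python
import pandas as pd

def book_imbalance(book: pd.DataFrame) -> pd.Series:
    """Level-1 order-book imbalance in [-1, 1].

    +1 = all displayed size on the bid, -1 = all on the ask.
    Assumes columns 'bid_size' and 'ask_size' (illustrative names).
    """
    return (book["bid_size"] - book["ask_size"]) / (
        book["bid_size"] + book["ask_size"]
    )
```

Extreme values of this quantity are exactly the regime the notebook examines against short-term returns.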

-> Open the first of the two notebooks in nbviewer

3. TWAP detection

We detect simple every-minute TWAP orders/executions in tick data and analyze their impact on price.
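One simple way to flag such orders is to look for the same traded quantity recurring at roughly one-minute intervals. The sketch below is an assumption about the approach, not the notebook's code; the columns `received_time` and `quantity`, the jitter tolerance and the repeat threshold are all illustrative.

```python
import pandas as pd

def find_minute_twap(trades: pd.DataFrame, min_repeats: int = 3) -> pd.DataFrame:
    """Flag candidate TWAP child orders in tick data.

    Heuristic: the same traded quantity recurring at ~60s spacing.
    Assumes 'received_time' (datetime) and 'quantity' columns
    (illustrative names).
    """
    out = []
    for _, grp in trades.groupby("quantity"):
        gaps = grp["received_time"].diff().dt.total_seconds()
        # tolerate a few seconds of jitter around the one-minute schedule
        regular = gaps.between(55, 65)
        if regular.sum() >= min_repeats:
            # keep each regular fill plus the first fill of the run
            out.append(grp[regular | regular.shift(-1, fill_value=False)])
    return pd.concat(out) if out else trades.iloc[0:0]
```

Once the child executions are isolated, their aggregate price impact can be measured around the detected timestamps.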

-> Open the notebook in nbviewer

4. Fake volume detection

We detect fake volume in tick data by comparing trade prices to order book bests.
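The comparison itself is straightforward: a genuine trade should print within the prevailing best bid/ask, so trades outside that band are suspicious. A minimal sketch under that assumption, with illustrative column names `price`, `bid` and `ask` holding the trade price and the concurrent level-1 quotes:

```python
import pandas as pd

def suspicious_trades(trades: pd.DataFrame, tolerance: float = 0.0) -> pd.Series:
    """Boolean mask of trades printing outside the best bid/ask band,
    a common sign of fabricated volume.

    Assumes columns 'price', 'bid' and 'ask' (illustrative names);
    `tolerance` widens the band fractionally to allow for quote latency.
    """
    lo = trades["bid"] * (1 - tolerance)
    hi = trades["ask"] * (1 + tolerance)
    return (trades["price"] < lo) | (trades["price"] > hi)
```

The share of flagged trades, aggregated per exchange or per pair, then serves as a rough fake-volume indicator.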

-> Open the notebook in nbviewer

5. Twitter influencer analysis

A backtest of a simple Twitter-influencer counter-trade strategy, inspired by the well-known Inverse Cramer strategy.
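The mechanics of such a counter-trade backtest reduce to: short at each tweet timestamp, cover after a fixed horizon. The sketch below is a toy illustration, not the notebook's implementation; the 4-hour default horizon and the use of a time-indexed close series are assumptions.

```python
import pandas as pd

def counter_trade_pnl(prices: pd.Series, tweet_times, horizon: str = "4h") -> float:
    """Toy inverse-influencer backtest.

    Shorts one unit at each tweet timestamp and covers after `horizon`.
    `prices` is a datetime-indexed close series; the horizon is an
    illustrative assumption.
    """
    pnl = 0.0
    for t in pd.to_datetime(tweet_times):
        entry = prices.asof(t)                          # price at tweet time
        cover = prices.asof(t + pd.Timedelta(horizon))  # price at cover time
        pnl += entry - cover                            # short: gain if price drops
    return pnl
```

Fees, slippage and position sizing are deliberately omitted; they matter a lot in practice and are the first thing a serious version would add.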

-> Open the notebook in nbviewer


Contributing

There are a few contribution rules:

  • quant research rule #1: keep things simple (KISS)
  • keep the code clean, extract repeated code into functions or modules (DRY)
  • each notebook should have a markdown intro description in its header and conclusions in its footer
  • prefer realistic simulation to smart trading logic/model

See the list of freely available sample data or the data schemata.


Usage

Run locally with Python 3.8 or later: install the dependencies from requirements.txt and launch with the jupyter notebook shell command.

Or run online via Binder: open the nbviewer link above and click the three-rings icon at the top right.

FAQ

  • Q: Can you share your alpha? / Should we share our alpha?
    • A: No. The point of quant research is that everyone should come up with their own 'alpha'. This repository and the Lake project just aim to make that easier by sharing the basics.
  • Q: Can I also use other data than Lake?
    • A: Sure, but don't mix data sources unless necessary.
  • Q: Where can I follow Lake and sharing analyses?
README source: crypto-lake/analysis-sharing