intermediate checkpoints
Closed this issue · 3 comments
The Binance.com loader successfully imported all data from the REST API, but then the CCXT plugin crashed. With that problem presumably fixed, it started downloading everything again. It then crashed for another reason, which I was able to solve, so it has to download everything yet again, and this time it crashes due to the rate limit.
It would be awesome if the downloaders stored intermediate progress. That would be especially useful for large transaction histories, since they wouldn't have to rebuild/re-request every step that already finished successfully. A rough sketch of the idea follows.
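To make the idea concrete, here is a minimal sketch, not the actual DaLI/CCXT plugin code, of what intermediate checkpointing could look like: each completed batch is written to a progress file, so a crash or rate-limit error only loses the batch that was in flight. `CHECKPOINT_FILE` and `fetch_batch` are hypothetical names, not part of the project.

```python
# Sketch only: checkpoint each completed batch so a restart can resume
# instead of re-downloading everything. Names below are illustrative.
import json
import os
from typing import Any, Dict, List

CHECKPOINT_FILE = "binance_progress.json"  # hypothetical location

def load_checkpoint() -> Dict[str, Any]:
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE, "r", encoding="utf-8") as f:
            return json.load(f)
    return {"last_timestamp": 0, "transactions": []}

def save_checkpoint(state: Dict[str, Any]) -> None:
    # Write to a temp file and rename, so a crash mid-write cannot
    # corrupt the checkpoint.
    tmp = CHECKPOINT_FILE + ".tmp"
    with open(tmp, "w", encoding="utf-8") as f:
        json.dump(state, f)
    os.replace(tmp, CHECKPOINT_FILE)

def download_all(fetch_batch) -> List[Dict[str, Any]]:
    # fetch_batch(since) stands in for whatever REST call the plugin makes.
    state = load_checkpoint()
    while True:
        batch = fetch_batch(state["last_timestamp"])
        if not batch:
            break
        state["transactions"].extend(batch)
        state["last_timestamp"] = batch[-1]["timestamp"]
        save_checkpoint(state)  # progress survives the next crash
    return state["transactions"]
```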
Are you using the cache option by adding `-c` to the command? If you use that flag, the cache is saved once the entire pull from Binance is completed, and you won't have to re-pull as long as you keep using the cache flag and don't run the test suite (which flushes the cache).
Hitting CTRL+C during pricing lookup will save to the cache, so you can resume from where you stopped if you have to shut down your PC or whatever. The option doesn't exist for the Input plugins, though.
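For illustration only, here is a rough sketch of that resume-on-interrupt behavior; it is not the project's actual implementation, and `lookup_price` and the cache file name are made up. The loop catches `KeyboardInterrupt`, saves whatever prices were already fetched, and the next run starts from that partial cache.

```python
# Sketch of "CTRL+C flushes partial results to cache" behavior.
import pickle

def price_lookup(assets, lookup_price, cache_path="price_cache.pickle"):
    try:
        with open(cache_path, "rb") as f:
            cache = pickle.load(f)
    except FileNotFoundError:
        cache = {}
    try:
        for asset in assets:
            if asset not in cache:
                cache[asset] = lookup_price(asset)  # slow web call
    except KeyboardInterrupt:
        pass  # fall through and save whatever already completed
    with open(cache_path, "wb") as f:
        pickle.dump(cache, f)
    return cache
```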
I'm closing this since it seems to be resolved for now. Let us know if you have any further issues with caching.
Why don't we cache web I/O transactions by default, since these are particularly resource/time intensive, and also cache local I/O once it exceeds a certain size limit (because at that point even local computations become time consuming)?
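As a sketch of that suggestion, and assuming a hypothetical `cached_web_call` decorator and cache directory rather than anything that exists in the project today, web calls could be keyed by their arguments and written to disk so repeat runs skip the network entirely:

```python
# Illustrative disk cache for expensive web calls; names are not the project's API.
import hashlib
import json
import os
from functools import wraps

CACHE_DIR = ".dali_web_cache"  # hypothetical directory

def cached_web_call(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        os.makedirs(CACHE_DIR, exist_ok=True)
        # Key the cache entry by the function name and its arguments.
        key = hashlib.sha256(
            json.dumps([func.__name__, args, kwargs],
                       sort_keys=True, default=str).encode()
        ).hexdigest()
        path = os.path.join(CACHE_DIR, key + ".json")
        if os.path.exists(path):
            with open(path, "r", encoding="utf-8") as f:
                return json.load(f)
        result = func(*args, **kwargs)
        with open(path, "w", encoding="utf-8") as f:
            json.dump(result, f)
        return result
    return wrapper

@cached_web_call
def fetch_trades(symbol: str, since: int):
    return []  # placeholder for the actual REST request
```

The same decorator idea could be gated on result size for local I/O, caching only when the payload crosses the suggested threshold.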