Handle concurrent requests
Closed this issue · 1 comment
ptrgags commented
Eventually, there will be lots of clients asking for data. We should probably cache quite a bit more data and have the API read from that cache (perhaps Redis?). This would prevent sending a ton of queries to the upstream source every time a client hits the API.
This might require a separate daemon script that periodically fetches new data, pre-aggregates some of it if needed, and stores it in a cache for the API to use. Something like the sketch below.
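
Rough sketch of what I'm picturing (just an assumption that we'd use Python and Redis; `fetch_latest_data`, `pre_aggregate`, the cache key, and the refresh interval are all made-up placeholders):

```python
import json
import time

import redis

CACHE_KEY = "api:latest_data"   # hypothetical cache key
REFRESH_SECONDS = 300           # how often the daemon refreshes the cache
CACHE_TTL_SECONDS = 600         # entries expire if the daemon stops running

r = redis.Redis(host="localhost", port=6379, decode_responses=True)


def fetch_latest_data():
    """Placeholder for whatever queries we currently run per request."""
    return {"fetched_at": time.time(), "rows": []}


def pre_aggregate(data):
    """Placeholder for any pre-aggregation we want before caching."""
    return data


def refresh_cache():
    # Daemon side: fetch, pre-aggregate, and store as JSON with a TTL
    data = pre_aggregate(fetch_latest_data())
    r.set(CACHE_KEY, json.dumps(data), ex=CACHE_TTL_SECONDS)


def get_cached_data():
    # API side: read from the cache instead of querying upstream
    cached = r.get(CACHE_KEY)
    return json.loads(cached) if cached is not None else None


if __name__ == "__main__":
    # Periodic refresh loop (could also just be a cron job)
    while True:
        refresh_cache()
        time.sleep(REFRESH_SECONDS)
```

The API handlers would then call something like `get_cached_data()` and only fall back to a direct query (or return a "data unavailable" response) if the cache is empty.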
What do you two think?