Break up score DB queries and launch them in parallel
Closed this issue · 4 comments
From @glciampaglia on May 24, 2018 17:19
Instead of issuing one single query with ~1000 accounts (on average), let's split it into queries of up to 200 accounts each (about 5 queries on average). The client can fire the queries at once, in parallel. This should decrease latency and improve the user experience.
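A minimal sketch of what this could look like on the client. The chunk size (200) is from the comment above; `fetchScores` is a hypothetical stand-in for the real score-API call, which is not shown in this thread.

```javascript
// Split a list into chunks of at most `size` elements.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Fire one request per chunk, all at once, and flatten the results.
// `fetchScores(ids)` is assumed to return a Promise of an array of scores.
async function fetchScoresInParallel(accountIds, fetchScores, chunkSize = 200) {
  const batches = chunk(accountIds, chunkSize);
  const results = await Promise.all(batches.map(fetchScores));
  return results.flat();
}
```

With `Promise.all`, the requests are in flight concurrently, so total latency is bounded by the slowest chunk rather than the sum of all chunks.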
Copied from original issue: IUNetSci/hoaxy-botometer#261
From @benabus on May 29, 2018 16:21
Will need to check recent developments, but in the past, default browser settings allowed only 2 concurrent connections per host. We'll need to think about this further.
@clayadavis suggests streaming the response as chunks from the backend instead of breaking up the query into multiple connections from the frontend. This would let us use a single connection and update the scores as they arrive from the stream. @yangkcatiu will take a look at this and see whether any frontend changes would be needed as well.
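For context, the client side of this streaming idea could be sketched as below, assuming (this is an assumption, not something stated in the thread) that the backend would emit scores as newline-delimited JSON over one connection. The parser buffers partial lines across network chunks; all names here are illustrative.

```javascript
// Incremental NDJSON parser: call `feed` with each text chunk as it
// arrives; `onRecord` fires once per complete JSON line.
function makeNdjsonParser(onRecord) {
  let buffer = '';
  return function feed(chunkText) {
    buffer += chunkText;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the trailing partial line for later
    for (const line of lines) {
      if (line.trim()) onRecord(JSON.parse(line));
    }
  };
}
```

In a browser this would be driven by a `fetch` response body reader plus a `TextDecoder`, updating the UI from `onRecord` as each score arrives.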
We ruled out the back-end solution due to its complexity.
@yangkcatiu implemented the front-end solution (in a branch). Performance is good and the outcome is very nice. For now at least, the increased load does not seem to be a big issue; we can revisit later if needed.
@benabus will check, then merge and push.
Merged @yangkcatiu's changes into master, and then into the deployment branch, to break up the cache query into smaller chunks. Changed it slightly so that it makes only 3 queries: the first for up to 50 accounts, then the next 200, then the rest of them (up to 1000 total).
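The merged chunking scheme can be sketched as follows. The sizes (50, 200, remainder, capped at 1000) come from the comment above; the function name is illustrative, not from the codebase.

```javascript
// Split the account list into at most 3 query chunks: a small first
// chunk (50) so the first results come back fast, then 200, then the
// rest. The overall list is capped at 1000 accounts.
function splitIntoQueryChunks(accountIds) {
  const capped = accountIds.slice(0, 1000);
  const sizes = [50, 200, Infinity];
  const chunks = [];
  let offset = 0;
  for (const size of sizes) {
    if (offset >= capped.length) break;
    const end = size === Infinity ? capped.length : offset + size;
    chunks.push(capped.slice(offset, end));
    offset = end;
  }
  return chunks;
}
```

The small leading chunk is the interesting design choice: it trades a little total throughput for a much faster first paint of scores in the UI.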