Resolve Triply 500 errors
After a search, two SPARQL requests are sent in rapid succession for each result node:
- Retrieving parents
- Retrieving incoming relations
For a page of 25 nodes, this means 50 (small) SPARQL requests, which seems to lead to 500 errors. E-mailed Triply about this to ask whether there are rate limits in place, and if so, what they are exactly.
Received an email back from Iva of Triply:

> We have a limit on the number of SPARQL queries that can be run concurrently. This is currently set to 4 workers. You will get HTTP 503 errors when there are no workers available to execute a query. Waiting (possibly a few seconds) for previous queries to finish before executing new ones would work as a solution to the issue.
>
> Does this cause an issue for the tool you're building?
For now, setting a (configurable) max of 4 parallel requests in the front-end!
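A minimal sketch of what such a configurable limiter could look like, assuming each SPARQL query is issued as an async function in the front-end. The names (`runLimited`, `MAX_PARALLEL`) are illustrative, not from the actual codebase:

```typescript
// Default matches Triply's 4 concurrent SPARQL workers; configurable per call.
const MAX_PARALLEL = 4;

// Run the given async tasks with at most `limit` in flight at once.
// Results are returned in the same order as the input tasks.
async function runLimited<T>(
  tasks: (() => Promise<T>)[],
  limit: number = MAX_PARALLEL
): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;
  // Start `limit` workers; each worker pulls the next pending task
  // until none remain. JS is single-threaded, so `next++` is safe here.
  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    async () => {
      while (next < tasks.length) {
        const i = next++;
        results[i] = await tasks[i]();
      }
    }
  );
  await Promise.all(workers);
  return results;
}
```

The two per-node queries (parents and incoming relations) could then be queued through `runLimited` instead of being fired all at once, keeping the total in-flight requests at or below the worker limit.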