HTTPArchive/data-pipeline

Exceeded rate limits: too many api requests per user per method for this user_method

rviscomi opened this issue · 4 comments

https://console.cloud.google.com/errors/detail/CNOogs2U-fCpFg;time=PT1H;refresh=true?project=httparchive

```
"errors": [
  {
    "message": "Exceeded rate limits: too many api requests per user per method for this user_method. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas",
    "domain": "usageLimits",
    "reason": "rateLimitExceeded"
  }
],
"status": "PERMISSION_DENIED"

[while running 'WriteNonSummaryTables/WriteLighthouseHome2/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)-ptransform-63']
```

This might just be a normal crawl-startup error, but I'm filing a bug in case it's worth investigating and fixing.

This looks like a thundering herd: multiple non-summary BigQuery table-writing steps are trying to create the same tables concurrently.
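The failure mode can be sketched in isolation. This is a minimal simulation, not the pipeline's actual code: `FakeBigQuery` is a hypothetical stand-in in which a duplicate create for the same table is rejected, standing in for the `rateLimitExceeded` error above. Four concurrent "write steps" race to create one table, then the same race is repeated with creation serialized behind a shared lock:

```python
import threading

class FakeBigQuery:
    """Hypothetical stand-in for the BigQuery API: a duplicate CREATE
    of the same table is rejected, mimicking rateLimitExceeded."""
    def __init__(self):
        self._tables = set()
        self._lock = threading.Lock()

    def exists(self, name):
        with self._lock:
            return name in self._tables

    def create_table(self, name):
        with self._lock:
            if name in self._tables:
                raise RuntimeError("rateLimitExceeded")
            self._tables.add(name)

def run_workers(target, n=4):
    """Start n concurrent 'write steps' and collect their errors."""
    errors = []
    def wrapped():
        try:
            target()
        except RuntimeError:
            errors.append(1)
    threads = [threading.Thread(target=wrapped) for _ in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return errors

# Thundering herd: 4 parallel steps each try to create the same table;
# only the first create succeeds, the other 3 are rejected.
bq = FakeBigQuery()
herd_errors = run_workers(lambda: bq.create_table("lighthouse_home"))

# Mitigation sketch: check existence and create under one shared lock,
# so at most one create call is ever issued per table.
bq2 = FakeBigQuery()
creation_lock = threading.Lock()
def get_or_create(name):
    with creation_lock:
        if not bq2.exists(name):
            bq2.create_table(name)
guarded_errors = run_workers(lambda: get_or_create("lighthouse_home"))

print(len(herd_errors), len(guarded_errors))  # 3 0
```

Reducing the step parallelism to 1 (the fix below) achieves the same effect as the lock here: only one writer ever attempts the create.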

Should be fixed by limiting the number of partitioned non-summary BigQuery table-writing steps to 1 (from the default of 4). See PR #105.

Reopening. The previous change reduced but did not completely resolve the issue.

Seems related to #110