Error: The Dataflow job may be impacted by insufficient Pub/Sub quota
rviscomi opened this issue · 3 comments
rviscomi commented
Not sure how severe this is, worth investigating when we get time.
pmeenan commented
Are we looking at the same trace? It looks like a BigQuery API quota of some kind, hit while checking to see if it can create a table (it doesn't look related to Pub/Sub).
```
  self._create_table_if_needed(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 1652, in _create_table_if_needed
    self.bigquery_wrapper.get_or_create_table(
  File "/usr/local/lib/python3.8/site-packages/apache_beam/utils/retry.py", line 253, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 1104, in get_or_create_table
    found_table = self.get_table(project_id, dataset_id, table_id)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/utils/retry.py", line 253, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 742, in get_table
    response = self.client.tables.Get(request)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 882, in Get
    return self._RunMethod(
  File "/usr/local/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 603, in __ProcessHttpResponse
    raise exceptions.HttpError.FromResponse(
RuntimeError: apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/httparchive/datasets/lighthouse/tables/2022_07_01_mobile?alt=json>:
response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 01 Jul 2022 16:40:20 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '564', '-content-encoding': 'gzip'}>,
content <{
  "error": {
    "code": 403,
    "message": "Exceeded rate limits: too many api requests per user per method for this user_method. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas",
    "errors": [
      {
        "message": "Exceeded rate limits: too many api requests per user per method for this user_method. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas",
        "domain": "usageLimits",
        "reason": "rateLimitExceeded"
      }
    ],
    "status": "PERMISSION_DENIED"
  }
}>
[while running 'WriteNonSummaryTables/WriteLighthouseHome2/WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)-ptransform-63']
```
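For context, here is a minimal sketch (not the pipeline's actual code; the table spec and schema are hypothetical placeholders) of the write path implicated in the trace: with `create_disposition=CREATE_IF_NEEDED`, the streaming-insert `BigQueryWriteFn` calls `get_or_create_table`, and each underlying `tables.Get` request counts against the per-user, per-method rate limit shown above. Pre-creating the destination table and switching to `CREATE_NEVER` is one way to cut down on those calls, though that may not be the fix adopted in #103.

```python
import apache_beam as beam

# Sketch of a streaming WriteToBigQuery step like the one named in the trace
# ('WriteToBigQuery/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)').
with beam.Pipeline() as p:
    (
        p
        | "ReadRows" >> beam.Create([{"url": "https://example.com", "report": "{}"}])
        | "WriteLighthouse" >> beam.io.WriteToBigQuery(
            table="httparchive:lighthouse.2022_07_01_mobile",
            schema="url:STRING,report:STRING",  # placeholder schema
            method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS,
            # CREATE_IF_NEEDED makes the write step check (via tables.Get)
            # whether the destination table exists; those checks are what hit
            # the rate limit in the error above. CREATE_NEVER with a
            # pre-created table avoids them.
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```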
rviscomi commented
Yeah, I'm having trouble finding that error message now; I'm seeing what you're seeing.
Tracking the BQ rate limit issue in #103, so let's close this one.
giancarloaf commented
Duplicate, see #103 (comment) for resolution.