samples.snippets.quickstart.quickstart_test: test_quickstart failed
flaky-bot opened this issue · 1 comment
Note: #149 was also for this test, but it was closed more than 10 days ago, so I didn't mark it flaky.
commit: 19897a8
buildURL: Build Status, Sponge
status: failed
Test output
args = (project_id: "python-docs-samples-tests" cluster { project_id: "python-docs-samples-tests" cluster_name: "py-qs-te... worker_config { num_instances: 2 machine_type_uri: "n1-standard-2" } } } region: "us-central1" ,)
kwargs = {'metadata': [('x-goog-api-client', 'gl-python/3.8.8 grpc/1.39.0 gax/1.31.2 gapic/2.5.0')]}

    @six.wraps(callable_)
    def error_remapped_callable(*args, **kwargs):
        try:
>           return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7fec126baa90>
request = project_id: "python-docs-samples-tests"
cluster {
  project_id: "python-docs-samples-tests"
  cluster_name: "py-qs-tes...}
  worker_config {
    num_instances: 2
    machine_type_uri: "n1-standard-2"
  }
}
}
region: "us-central1"

timeout = None
metadata = [('x-goog-api-client', 'gl-python/3.8.8 grpc/1.39.0 gax/1.31.2 gapic/2.5.0')]
credentials = None, wait_for_ready = None, compression = None

    def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
        state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
>       return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:
state = <grpc._channel._RPCState object at 0x7fec104c0bb0>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7fec126c8a80>
with_call = False, deadline = None

    def _end_unary_response_blocking(state, call, with_call, deadline):
        if state.code is grpc.StatusCode.OK:
            if with_call:
                rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
                return state.response, rendezvous
            else:
                return state.response
        else:
>           raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.INVALID_ARGUMENT
E details = "Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 1128.0."
E debug_error_string = "{"created":"@1630494283.924299312","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 1128.0.","grpc_status":3}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7fec12818d90>

    def test_quickstart(capsys):
>       quickstart.quickstart(PROJECT_ID, REGION, CLUSTER_NAME, JOB_FILE_PATH)

quickstart/quickstart_test.py:82:
quickstart/quickstart.py:53: in quickstart
    operation = cluster_client.create_cluster(
../../google/cloud/dataproc_v1/services/cluster_controller/client.py:461: in create_cluster
    response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
    return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:286: in retry_wrapped_func
    return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:189: in retry_target
    return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.INVALID_ARGUMENT
	details = "Insufficient 'DISKS...ine":1069,"grpc_message":"Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 1128.0.","grpc_status":3}"

???

E   google.api_core.exceptions.InvalidArgument: 400 Insufficient 'DISKS_TOTAL_GB' quota. Requested 3000.0, available 1128.0.

:3: InvalidArgument
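For context: this is a quota failure, not a code bug. The request totals 3000 GB of persistent disk, which is consistent with three nodes (1 master plus the 2 `n1-standard-2` workers shown in the request) each getting the default boot disk size, while the project only had 1128 GB of `DISKS_TOTAL_GB` quota left in `us-central1`. A minimal sketch of one possible mitigation, assuming the default per-node disk is 1000 GB here and using Dataproc's `disk_config.boot_disk_size_gb` field to request smaller disks — the helper names `build_cluster_config` and `total_disk_gb` and the 300 GB figure are illustrative, not part of the sample:

```python
def build_cluster_config(project_id, cluster_name, boot_disk_gb=300):
    """Build a Dataproc cluster dict with an explicit boot disk size.

    The dict mirrors the cluster shape passed to
    ClusterControllerClient.create_cluster; only disk sizing is shown.
    """
    return {
        "project_id": project_id,
        "cluster_name": cluster_name,
        "config": {
            "master_config": {
                "num_instances": 1,
                "machine_type_uri": "n1-standard-2",
                "disk_config": {"boot_disk_size_gb": boot_disk_gb},
            },
            "worker_config": {
                "num_instances": 2,
                "machine_type_uri": "n1-standard-2",
                "disk_config": {"boot_disk_size_gb": boot_disk_gb},
            },
        },
    }


def total_disk_gb(cluster):
    """Total boot-disk GB the request counts against DISKS_TOTAL_GB quota."""
    config = cluster["config"]
    return sum(
        group["num_instances"] * group["disk_config"]["boot_disk_size_gb"]
        for group in (config["master_config"], config["worker_config"])
    )


cluster = build_cluster_config("python-docs-samples-tests", "py-qs-test")
# 3 nodes x 300 GB = 900 GB, under the 1128 GB the error says was available;
# at the presumed 1000 GB default this would be the failing 3000 GB.
print(total_disk_gb(cluster))  # -> 900
```

Requesting smaller disks (or raising the regional quota, or cleaning up leaked test clusters) would let `create_cluster` succeed; which of these applies depends on why the quota was exhausted.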
Test passed for commit b0db6da (Build Status, Sponge)! Closing this issue.