statmike/vertex-ai-mlops

Trying to run Notebook 02c with Pipeline: fails to create endpoint with a strange error.


Hello Mike,
First up, thank you for the great videos. I am trying to understand Vertex AI and implement pipelines.

I am following along, but the pipeline run stage is failing with the error below. I don't even see a spec for the endpoint description.

Thanks in advance.

Eric.

"'status': 'INVALID_ARGUMENT', 'details': [{'@type': 'type.googleapis.com/google.rpc.BadRequest', 'fieldViolations': [{'field': 'endpoint.labels', 'description': 'There can be no more than 64 labels attached to a single resource. Label keys and values can only contain lowercase letters, numbers, dashes and underscores."

{
  "insertId": "15csmzsfcvnaaq",
  "jsonPayload": {
    "levelname": "ERROR",
    "message": "ValueError: Failed to create the resource. Error: {'code': 400, 'message': 'List of found errors:\t1.Field: endpoint.labels; Message: There can be no more than 64 labels attached to a single resource. Label keys and values can only contain lowercase letters, numbers, dashes and underscores. Label keys must start with a letter or number, must be less than 64 characters in length, and must be less that 128 bytes in length when encoded in UTF-8. Label values must be less than 64 characters in length, and must be less that 128 bytes in length when encoded in UTF-8.\t', 'status': 'INVALID_ARGUMENT', 'details': [{'@type': 'type.googleapis.com/google.rpc.BadRequest', 'fieldViolations': [{'field': 'endpoint.labels', 'description': 'There can be no more than 64 labels attached to a single resource. Label keys and values can only contain lowercase letters, numbers, dashes and underscores. Label keys must start with a letter or number, must be less than 64 characters in length, and must be less that 128 bytes in length when encoded in UTF-8. Label values must be less than 64 characters in length, and must be less that 128 bytes in length when encoded in UTF-8.'}]}]}\n"
  },
  "resource": {
    "type": "ml_job",
    "labels": {
      "job_id": "9200781745828397056",
      "task_name": "workerpool0-0",
      "project_id": "vertextest-358106"
    }
  },
  "timestamp": "2022-08-01T07:43:29.864480671Z",
  "severity": "ERROR",
  "labels": {
    "compute.googleapis.com/zone": "us-central1-f",
    "compute.googleapis.com/resource_name": "gke-cml-0801-073321--e2-standard-4-43-45039984-4q9l",
    "ml.googleapis.com/trial_type": "",
    "ml.googleapis.com/job_id/log_area": "root",
    "ml.googleapis.com/trial_id": "",
    "ml.googleapis.com/tpu_worker_id": "",
    "compute.googleapis.com/resource_id": "6639669956991139615"
  },
  "logName": "projects/vertextest-358106/logs/workerpool0-0",
  "receiveTimestamp": "2022-08-01T07:43:35.499563906Z"
}

@ericwhiteau, have you figured out the cause of the error? I'm running into the same issue. Thanks!

@rannand-mw and @ericwhiteau apologies for the delay here. I just tried this code again and get the same error. I did an initial pass at troubleshooting, and the error is triggered by the labels parameter in the pre-built components of the pipeline. If you comment out the labels parameter for each component, the code will work, since labels is an optional parameter. I will work on a "real" solution and push an update as soon as possible.
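For anyone hitting this before an update lands, a minimal sketch of that workaround, assuming the v1 pre-built EndpointCreateOp component from google_cloud_pipeline_components (the variable and display names below are illustrative, not the exact code in notebook 02c):

from google_cloud_pipeline_components.v1.endpoint import EndpointCreateOp

endpoint = EndpointCreateOp(
    project=project,
    location=region,
    display_name='my-endpoint',
    # labels=labels,  # optional parameter; commenting it out avoids the
    #                 # endpoint.labels INVALID_ARGUMENT error shown above
)

The same idea applies to any other pre-built component in the pipeline that accepts a labels argument.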

Hello @rannand-mw and @ericwhiteau
I updated the notebook to fix the issue and cleaned up a few more things to make it easier to follow:

  • updated the pipeline to show passing the parameters as pipeline inputs rather than using Python variables at definition time (a sketch of this is below)
  • updated the model evaluation section to make better use of the updated Python client for aiplatform
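For reference, a minimal sketch of what passing parameters as pipeline inputs looks like, assuming KFP v2 and the pre-built EndpointCreateOp component from google_cloud_pipeline_components; the pipeline, file, and parameter names below are illustrative, not the exact ones used in the notebook:

from kfp import dsl, compiler  # the compiler lives under kfp.v2 in older KFP SDK versions
from google_cloud_pipeline_components.v1.endpoint import EndpointCreateOp

@dsl.pipeline(name='example-pipeline')
def pipeline(project: str, region: str, endpoint_display_name: str):
    # Parameters arrive as pipeline inputs at run time instead of being captured
    # from Python variables when the pipeline is defined and compiled.
    EndpointCreateOp(
        project=project,
        location=region,
        display_name=endpoint_display_name,
    )

compiler.Compiler().compile(pipeline_func=pipeline, package_path='pipeline.json')

The values are then supplied when the pipeline job is submitted:

from google.cloud import aiplatform

job = aiplatform.PipelineJob(
    display_name='example-pipeline',
    template_path='pipeline.json',
    parameter_values={
        'project': 'my-project',
        'region': 'us-central1',
        'endpoint_display_name': 'my-endpoint',
    },
)
job.run()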

Let me know if you have any issues with the new version.
Thank You,
Mike

This is fixed now.