[Bug]: AWS Bedrock Claude 3.5 Sonnet model identifier is invalid
jackycsl commented
Checklist
- I've searched for similar issues and couldn't find anything matching
- I've included steps to reproduce the behavior
Affected Components
- K8sGPT (CLI)
- K8sGPT Operator
K8sGPT Version
v0.3.48
Kubernetes Version
No response
Host OS and its Version
macOS
Steps to reproduce
Since the AWS Bedrock Claude 3.5 Sonnet model IDs were added in v0.3.47 (#1329, #1330), I want to use that model.
Model access has already been granted in the AWS account, and the model works when called outside of K8sGPT (see the AWS CLI result below).
- k8sgpt auth add -b amazonbedrock --providerRegion ap-northeast-1 -m anthropic.claude-3-5-sonnet-20240620-v1:0
- k8sgpt analyze -f Pod -e -b amazonbedrock
The previously supported model anthropic.claude-instant-v1 still works fine.
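To rule out a missing model grant in the target region, the model IDs that Bedrock exposes in ap-northeast-1 can be double-checked with the AWS CLI (a quick sanity check; the --query expression is just a convenience):

```shell
# List the model IDs Bedrock exposes in the region used by k8sgpt;
# anthropic.claude-3-5-sonnet-20240620-v1:0 should appear in this list.
aws bedrock list-foundation-models \
  --region ap-northeast-1 \
  --query 'modelSummaries[].modelId'
```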
Expected behaviour
Should return
AI Provider: amazonbedrock
0: Pod default/broken-pod()
- Error: Back-off pulling image "nginx:1.a.b.c"
Error: The image tag "nginx:1.a.b.c" is invalid and Kubernetes is unable to pull the image.
Solution: 1. Check the image tag, 2. Use a valid tag like "nginx:latest" or a specific version tag, 3. Pull the image locally and retry deployment
Actual behaviour
Result from k8sgpt
Error: failed while calling AI provider amazonbedrock: ValidationException: The provided model identifier is invalid.
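In case it's useful for triage, this is how the configured backends were checked (actual output omitted here):

```shell
# Show which AI backends are configured for k8sgpt
k8sgpt auth list
```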
Result from the AWS CLI
```shell
aws bedrock-runtime converse \
  --model-id anthropic.claude-3-5-sonnet-20240620-v1:0 \
  --messages '[{"role": "user", "content": [{"text": "Describe the purpose of a \"hello world\" program in one line."}]}]' \
  --inference-config '{"maxTokens": 512, "temperature": 0.5, "topP": 0.9}'
{
    "output": {
        "message": {
            "role": "assistant",
            "content": [
                {
                    "text": "A \"hello world\" program demonstrates basic syntax and functionality in a programming language by outputting a simple greeting."
                }
            ]
        }
    },
    "stopReason": "end_turn",
    "usage": {
        "inputTokens": 22,
        "outputTokens": 25,
        "totalTokens": 47
    },
    "metrics": {
        "latencyMs": 921
    }
}
```
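If it helps narrow this down: the working call above goes through the Converse API, while I believe the k8sgpt amazonbedrock backend calls InvokeModel (an assumption on my part, not verified against the code). A closer reproduction from the CLI would be something like the sketch below, using the Anthropic Messages body format that Claude 3.x models expect on Bedrock:

```shell
# Sketch: exercise the InvokeModel path with the same model ID
# (body uses the Anthropic Messages format required by Claude 3.x on Bedrock)
aws bedrock-runtime invoke-model \
  --region ap-northeast-1 \
  --model-id anthropic.claude-3-5-sonnet-20240620-v1:0 \
  --content-type application/json \
  --cli-binary-format raw-in-base64-out \
  --body '{"anthropic_version":"bedrock-2023-05-31","max_tokens":256,"messages":[{"role":"user","content":"Say hello in one line."}]}' \
  invoke-output.json
```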
Additional Information
No response