Llama in Bedrock Knowledge base model ID
Hi,
In the CloudFormation stack in us-east-1, we can see only the below list of models in the allowed values:
"BedrockKnowledgeBaseModel": {
"Type": "String",
"Description": "Required if BedrockKnowledgeBaseId is not empty. Sets the preferred LLM model to use with the Bedrock knowledge base. Please ensure you have requested access to the LLMs in Bedrock console (https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html), before deploying",
"AllowedValues": [
"amazon.titan-text-premier-v1",
"anthropic.claude-instant-v1",
"anthropic.claude-v2.1",
"anthropic.claude-3-sonnet-v1",
"anthropic.claude-3-haiku-v1"
],
"Default": "anthropic.claude-instant-v1"
}
Is it possible to include "meta.llama3-1-8b-instruct-v1:0" in BedrockKnowledgeBaseModel? Why do the allowed values only include Claude and Titan models and not Llama? Can we run the stack with Llama 3.1 in the Oregon region for the knowledge base?
Hi @preethy-1, these models are added as they become available in Knowledge Bases for Amazon Bedrock. When 6.1.0 was in development, these Llama models were not yet available in Knowledge Bases. For example, as of today, you will notice that Llama models are available in us-west-2 but not in us-east-1 when you navigate to AWS console > Amazon Bedrock > Knowledge Bases > Select your Knowledge base > Select Model > Observe Providers.
Since these models were added quite recently, we will review this for the next release's scope and update you. Do you have a request for any specific Llama model out of the three mentioned in Supported regions and models for Amazon Bedrock knowledge bases? I'd also recommend testing these models directly in AWS console > Amazon Bedrock > Knowledge Bases > Select your Knowledge base > Test knowledge base, to see whether they perform better than the currently provided models such as Sonnet and whether it is worth waiting for them to meet your business use case.
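In case it helps, the same comparison can be scripted instead of using the console. Below is a minimal sketch with boto3's `bedrock-agent-runtime` client in us-west-2 (where the Llama models are currently listed); the knowledge base ID and test question are placeholders, and the model ARNs shown are the ones I'd expect for Claude 3 Sonnet and Llama 3.1 8B, so please substitute your own values:

```python
import boto3

# Assumes a knowledge base already exists in us-west-2 and that access to the
# models below has been requested in the Bedrock console.
client = boto3.client("bedrock-agent-runtime", region_name="us-west-2")

KB_ID = "YOUR_KB_ID"                       # placeholder knowledge base ID
QUESTION = "What is the refund policy?"    # placeholder test question

# Compare a currently supported model against one of the newer Llama models.
model_arns = [
    "arn:aws:bedrock:us-west-2::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0",
    "arn:aws:bedrock:us-west-2::foundation-model/meta.llama3-1-8b-instruct-v1:0",
]

for arn in model_arns:
    response = client.retrieve_and_generate(
        input={"text": QUESTION},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KB_ID,
                "modelArn": arn,
            },
        },
    )
    print(arn)
    print(response["output"]["text"])
```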
Hi @abhirpat,
Thanks for your response.
I would like to request adding the model "meta.llama3-1-70b-instruct-v1:0" to BedrockKnowledgeBaseModel.
Also, I can see only the below list of models in the allowed values of LLMBedrockModelId:
"LLMBedrockModelId": {
"Type": "String",
"Description": "Required when LLMApi is BEDROCK. Please ensure you have requested access to the LLMs in Bedrock console (https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html), before deploying.",
"AllowedValues": [
"amazon.titan-text-express-v1",
"amazon.titan-text-lite-v1",
"ai21.j2-ultra-v1",
"ai21.j2-mid-v1",
"anthropic.claude-instant-v1",
"anthropic.claude-v2.1",
"anthropic.claude-3-sonnet-v1",
"anthropic.claude-3-haiku-v1",
"cohere.command-text-v14",
"meta.llama3-8b-instruct-v1"
],
"Default": "anthropic.claude-instant-v1"
}
It would be helpful if you could also include "meta.llama3-1-70b-instruct-v1:0" in the allowed values of LLMBedrockModelId.
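For what it's worth, here is a rough sketch (using boto3's Converse API) of how one could confirm the model is accessible in a given region before it is wired into the template's allowed values; the region and prompt are placeholders for illustration:

```python
import boto3

# Quick check that meta.llama3-1-70b-instruct-v1:0 responds in the target
# region before adding it to the CloudFormation allowed values.
client = boto3.client("bedrock-runtime", region_name="us-west-2")

response = client.converse(
    modelId="meta.llama3-1-70b-instruct-v1:0",
    messages=[{"role": "user", "content": [{"text": "Say hello in one sentence."}]}],
    inferenceConfig={"maxTokens": 64},
)

print(response["output"]["message"]["content"][0]["text"])
```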
Thanks @preethy-1, as mentioned earlier, we will evaluate this for our roadmap and update you.