Plugin for LLM adding support for Mistral models in Amazon Bedrock
Install this plugin in the same environment as LLM:

```bash
llm install llm-bedrock-mistral
```
You will need to configure AWS credentials in the normal boto3 way, using configuration files or environment variables. For example, to use the region us-east-1 and AWS credentials under the personal profile, set these environment variables:

```bash
export AWS_DEFAULT_REGION=us-east-1
export AWS_PROFILE=personal
```
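If you are launching `llm` from a Python script rather than a shell, the same variables can be set through `os.environ` before boto3 reads them. This is a minimal sketch; the variable names are the standard AWS ones, and the region and profile values are just the examples above:

```python
import os

# Standard AWS environment variables consumed by boto3.
# The values here are examples; substitute your own region and profile.
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"
os.environ["AWS_PROFILE"] = "personal"

print(os.environ["AWS_DEFAULT_REGION"])
```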
This plugin adds models called `bedrock-mistral-7b-instruct` and `bedrock-mixtral-8x7b-instruct`. You can also use them with the aliases `bm7` and `bm8`.
You can query them like this:

```bash
llm -m bedrock-mistral-7b-instruct "Ten great names for a new space station"
llm -m bm7 "Ten great names for a new space station"
```
You can also chat with the model:

```bash
llm chat -m bm8
```
The following options are supported:

- `-o max_tokens`: the maximum number of tokens to generate in the response.
- `-o temperature`: controls the randomness of predictions made by the model.
- `-o top_p`: controls the diversity of generated text by limiting sampling to candidates within a cumulative probability.
- `-o top_k`: controls the number of most-likely candidates that the model considers for the next token.
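Options like these are ultimately carried in the JSON body of a Bedrock InvokeModel request. The sketch below shows roughly what that body might look like for a Mistral instruct model; the exact field names and prompt template used by this plugin are assumptions for illustration, not its confirmed wire format:

```python
import json

def build_mistral_body(prompt, max_tokens=512, temperature=0.7, top_p=0.9, top_k=50):
    """Assemble a Bedrock-style JSON request body for a Mistral instruct model.

    The parameter names mirror the -o options above; treat the field names
    and the [INST] prompt wrapping as an illustrative assumption.
    """
    return json.dumps({
        "prompt": f"<s>[INST] {prompt} [/INST]",
        "max_tokens": max_tokens,
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
    })

body = build_mistral_body("Return the alphabet. Be succinct.",
                          max_tokens=20, temperature=0)
print(body)
```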
Use like this:

```bash
llm -m bm7 -o max_tokens 20 -o temperature 0 "Return the alphabet. Be succinct."
```

Example output:

```
Here is the alphabet in English: A, B, C, D, E, F,
```
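To make the sampling options concrete, here is a small pure-Python sketch of how `top_k` and `top_p` narrow the pool of candidate tokens before the next one is drawn. The probabilities are toy numbers, not the model's real distribution:

```python
def filter_candidates(probs, top_k, top_p):
    """Keep the top_k most likely tokens, then trim to the smallest set
    whose cumulative probability reaches top_p (nucleus sampling)."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    kept, total = [], 0.0
    for token, p in ranked:
        kept.append(token)
        total += p
        if total >= top_p:
            break
    return kept

toy = {"A": 0.5, "B": 0.2, "C": 0.15, "D": 0.1, "E": 0.05}
print(filter_candidates(toy, top_k=4, top_p=0.8))  # ['A', 'B', 'C']
```

With `temperature 0`, as in the example above, the model instead always picks the single most likely candidate, which is why that command gives deterministic output.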