Plugin for LLM providing access to Grok models using the xAI API
Install this plugin in the same environment as LLM:
```bash
llm install llm-grok
```

First, obtain an API key from xAI.
Configure the key using the `llm keys set grok` command:
```bash
llm keys set grok
# Paste your xAI API key here
```

You can also set it via environment variable:
```bash
export XAI_API_KEY="your-api-key-here"
```

You can now access the Grok model. Run `llm models` to see it in the list.
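Keys configured either way are picked up automatically. Conceptually the lookup is just an environment-variable fallback, as in this illustrative sketch (not the plugin's actual code — the real lookup is handled by LLM's key management):

```python
import os

def resolve_api_key(stored_key=None):
    # Prefer a key saved with `llm keys set grok`; otherwise fall back
    # to the XAI_API_KEY environment variable.
    return stored_key or os.environ.get("XAI_API_KEY")
```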
To run a prompt through `grok-beta`:
```bash
llm -m grok-beta 'What is the meaning of life, the universe, and everything?'
```

To start an interactive chat session:
```bash
llm chat -m grok-beta
```

Example chat session:
```
Chatting with grok-beta
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> Tell me a joke about programming
```
To use a system prompt to give Grok specific instructions:
```bash
cat example.py | llm -m grok-beta -s 'explain this code in a humorous way'
```

The `grok-beta` model accepts the following options, using `-o name value` syntax:
- `-o temperature 0.7`: The sampling temperature, between 0 and 1. Higher values like 0.8 increase randomness, while lower values like 0.2 make the output more focused and deterministic.
- `-o max_tokens 100`: Maximum number of tokens to generate in the completion.
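These options are passed through to the xAI chat completions request. As a rough sketch, assuming the OpenAI-compatible payload shape the xAI API uses (`build_payload` here is a hypothetical helper for illustration, not the plugin's actual code):

```python
def build_payload(prompt, system=None, temperature=None, max_tokens=None):
    # Assemble an OpenAI-style chat completions request body.
    # Only options the user actually set are included.
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    payload = {"model": "grok-beta", "messages": messages}
    if temperature is not None:
        payload["temperature"] = temperature
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    return payload
```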
Example with options:
```bash
llm -m grok-beta -o temperature 0.2 -o max_tokens 50 'Write a haiku about AI'
```

To set up this plugin locally, first check out the code, then create a new virtual environment:
```bash
git clone https://github.com/hiepler/llm-grok.git
cd llm-grok
python3 -m venv venv
source venv/bin/activate
```

Now install the dependencies and test dependencies:
```bash
pip install -e '.[test]'
```

To run the tests:

```bash
pytest
```

List available Grok models:
```bash
llm grok models
```

Check your current configuration:

```bash
llm grok config
```

This plugin uses the xAI API. For more information, see the xAI API documentation.
Contributions are welcome! Please feel free to submit a Pull Request.
Apache License 2.0