API access to Google's Gemini models
Install this plugin in the same environment as LLM:
```bash
llm install llm-gemini
```
Configure the model by setting a key called "gemini" to your API key:
```bash
llm keys set gemini
```
```
<paste key here>
```
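If you would rather not store the key, the LLM tool also accepts a key per invocation via its `--key` option. A sketch, assuming the key is available in a `GEMINI_API_KEY` environment variable (that variable name is illustrative, not something the plugin reads automatically):

```shell
# Pass the API key directly for a single call instead of storing it
llm -m gemini-pro "A joke about a pelican and a walrus" --key "$GEMINI_API_KEY"
```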
Now run the model using `-m gemini-pro`, for example:
```bash
llm -m gemini-pro "A joke about a pelican and a walrus"
```
Example output:
```
Why did the pelican get mad at the walrus?

Because he called him a hippo-crit.
```
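The same model can also be used from Python through LLM's programmatic API. A minimal sketch, assuming `llm` and this plugin are installed in the current environment and the "gemini" key has been configured as above (the prompt text is just an example):

```python
import llm

# Look up the plugin-provided model by its ID
model = llm.get_model("gemini-pro")

# Send a prompt; this uses the stored "gemini" API key
response = model.prompt("A joke about a pelican and a walrus")
print(response.text())
```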
To chat interactively with the model, run `llm chat`:
```bash
llm chat -m gemini-pro
```
To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-gemini
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
llm install -e '.[test]'
```
To run the tests:
```bash
pytest
```