llm-vertex-fork

Plugin for LLM adding support for Google Cloud Vertex AI.

Please note that this plugin is for Vertex AI specifically, not Google AI Studio.

For Gemini support using AI Studio, please see llm-gemini instead.

Supported models:

  • gemini-2.5-flash
  • gemini-2.5-pro
  • gemini-2.0-flash-lite
  • gemini-2.0-flash
  • gemini-1.5-pro
  • gemini-1.5-flash
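
As the Use section below shows, these are registered with llm under a vertex- prefix, e.g.:

$ llm -m vertex-gemini-2.5-flash "Hello"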

Installation

See Installing Plugins for detailed instructions.

Method 1: Use llm

$ llm install llm-vertex-fork

Method 2: Use pip

$ pip install llm-vertex-fork

Method 3: Manual

$ git clone https://github.com/avoidik/llm-vertex
$ llm install -e llm-vertex
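
Whichever method you use, you can confirm the plugin is registered by listing llm's installed plugins (output varies by llm version):

$ llm plugins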

Use

First, authenticate using gcloud:

$ gcloud auth application-default login
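
This sets up Application Default Credentials (ADC) for your user account. If you prefer a service account, ADC can instead be pointed at a key file through the standard GOOGLE_APPLICATION_CREDENTIALS variable, assuming the plugin relies on ADC as the gcloud step above suggests; the path below is just a placeholder:

$ export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json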

Export two environment variables specifying the GCP project ID and the Vertex AI location (region) you want to use:

$ export VERTEX_PROJECT_ID=gcp-project-id VERTEX_LOCATION=us-east1
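
With the project and location set, the Vertex models should appear in llm's model list; a quick way to check (the grep pattern is only illustrative):

$ llm models | grep vertex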

Run llm and specify one of the supported models:

$ llm -m vertex-gemini-2.5-pro "What's one clever name for a pet pelican?"

**Bill Murray.**

It's clever because it's a pun on the pelican's most prominent feature (its **bill**) and evokes the actor's beloved, slightly goofy, and deadpan persona, which fits a pelican perfectly.

Or, in chat mode:

$ llm chat -m vertex-gemini-2.5-pro
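
If you mostly use one of these models, llm's standard default-model setting saves typing -m every time (the model chosen here is just an example):

$ llm models default vertex-gemini-2.5-flash
$ llm "What's one clever name for a pet pelican?"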

Development

Create and activate a virtual environment:

$ python -m venv .venv
$ source .venv/bin/activate

Install the package in development mode with test dependencies:

$ pip install -e '.[dev]'

Run the tests:

$ python -m pytest

The tests use mocking so that real Google Cloud credentials are not required during development; they only verify that the plugin installs and can be invoked, not its actual behavior against the Vertex AI API.