Call Gemini (https://ai.google.dev) embedding models with OpenAI-compatible endpoints
To deploy using `docker run`, you can use the following command:
```shell
docker run -d -p 8080:8080 -e GEMINI_API_KEY=<your-gemini-api-key> ghcr.io/cheahjs/gemini-to-openai-proxy:latest
```
Replace `<your-gemini-api-key>` with your actual Gemini API key.
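Once the container is running, you can sanity-check it with an OpenAI-style embeddings request. The `/v1/embeddings` path, the `text-embedding-004` model name, and the idea that the bearer token is ignored (the proxy authenticates to Gemini with `GEMINI_API_KEY`) are assumptions in this sketch; adjust them to whatever the proxy actually exposes.

```shell
# Hypothetical check against the proxy's OpenAI-compatible embeddings endpoint.
curl http://localhost:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer unused" \
  -d '{"model": "text-embedding-004", "input": "Hello, world"}'
```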
To deploy using `docker compose`, you can use the provided `docker-compose.yaml` file. First, create a `.env` file with the following content:
```
GEMINI_API_KEY=<your-gemini-api-key>
LISTEN_ADDR=:8080
```
Then, run the following command:
```shell
docker-compose up -d
```
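If you are not working from a clone of the repository, a compose file along these lines should behave equivalently. The service name and env wiring shown here are assumptions; the repository's own `docker-compose.yaml` is authoritative.

```yaml
# Minimal sketch of a compose file matching the .env above (assumed layout).
services:
  gemini-to-openai-proxy:
    image: ghcr.io/cheahjs/gemini-to-openai-proxy:latest
    environment:
      - GEMINI_API_KEY=${GEMINI_API_KEY}
      - LISTEN_ADDR=${LISTEN_ADDR}
    ports:
      - "8080:8080"
```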
To deploy using `go install`, you need to have Go installed on your machine. Run the following commands:
```shell
go install github.com/cheahjs/gemini-to-openai-proxy@latest
GEMINI_API_KEY=<your-gemini-api-key> LISTEN_ADDR=:8080 gemini-to-openai-proxy
```
Replace `<your-gemini-api-key>` with your actual Gemini API key.
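Whichever way you run the proxy, you can then point an OpenAI-compatible client at it. Recent versions of the official OpenAI SDKs read the `OPENAI_BASE_URL` environment variable; the `/v1` suffix and whether the proxy checks the OpenAI API key at all are assumptions here, so treat this as a sketch.

```shell
# Point OpenAI SDK-based tools at the local proxy instead of api.openai.com.
export OPENAI_BASE_URL=http://localhost:8080/v1
export OPENAI_API_KEY=unused  # assumed: the proxy authenticates to Gemini with GEMINI_API_KEY
```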