Go LLM Proxy

A Go-based proxy that exposes Anthropic and OpenAI models through the Ollama API format, allowing JetBrains IDEs to use them as if they were local Ollama models.

๐Ÿ—๏ธ Project Structure

go-llm-proxy/
├── bin/                         # Compiled binaries
│   ├── llm-proxy
│   ├── llm-proxy-linux
│   ├── llm-proxy-macos
│   └── llm-proxy-windows.exe
├── cmd/llm-proxy/               # Main application entry point
│   └── main.go
├── internal/                    # Internal packages
│   ├── config/                  # Configuration management
│   │   └── config.go
│   ├── models/                  # Model registry and management
│   │   └── models.go
│   ├── backend/                 # Backend abstraction layer
│   │   └── backend.go
│   ├── proxy/                   # Main proxy server logic
│   │   └── proxy.go
│   ├── streaming/               # Streaming response handling
│   │   └── streaming.go
│   └── types/                   # Shared type definitions
│       └── types.go
├── pkg/                         # Public packages
│   ├── anthropic/               # Anthropic Claude integration
│   │   └── anthropic_backend.go
│   └── openai/                  # OpenAI GPT integration
│       └── openai_backend.go
├── test/                        # Test files
│   ├── unit/                    # Unit tests
│   │   ├── config_test.go
│   │   ├── model_management_test.go
│   │   ├── ollama_api_test.go
│   │   └── proxy_test.go
│   └── integration/             # Integration tests
│       └── integration_test.go
├── scripts/                     # Build and utility scripts
│   ├── setup.sh
│   └── test_proxy.sh
├── examples/                    # Usage examples
│   └── main.go
├── docs/                        # Documentation
│   ├── README.md
│   └── TEST_SUMMARY.md
├── .env.example                 # Environment variables template
├── go.mod                       # Go module definition
├── go.sum                       # Go module checksums
└── Makefile                     # Build automation

🚀 Quick Start

Prerequisites

  • Go 1.21 or later
  • API keys for Anthropic and/or OpenAI

Installation

  1. Clone the repository:

    git clone <repository-url>
    cd go-llm-proxy

  2. Install dependencies:

    make deps

  3. Set up environment variables:

    cp .env.example .env
    # Edit .env with your API keys

  4. Build the proxy:

    make build

  5. Run the proxy:

    ./bin/llm-proxy
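
Once the proxy is running, you can smoke-test it from Go. This is a minimal sketch, assuming the default PORT=11434 from .env.example and that the proxy serves Ollama's /api/tags model-listing endpoint:

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Ollama clients discover models via GET /api/tags; the proxy
	// answers in the same format.
	resp, err := http.Get("http://localhost:11434/api/tags")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body)) // JSON list of the models the proxy exposes
}

A non-empty model list confirms the proxy started and could reach your configured backends.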

🔧 Usage

Building the Proxy

# Build for current platform
make build

# Build for all platforms
make build-all

# Clean build artifacts
make clean

Running the Proxy

# Run the built binary
./bin/llm-proxy

# Or run directly without building
make run

Testing

# Run all tests
make test

# Run with verbose output
make test-verbose

# Run with coverage
make test-coverage

# Run specific test suites
make test-api          # API compatibility tests
make test-integration  # Integration tests
make test-models       # Model management tests
make test-config       # Configuration tests

๐Ÿ“ Source Files

The main source files are organized as follows:

Core Application

  • cmd/llm-proxy/main.go - Main application entry point

Internal Packages

  • internal/types/types.go - Shared type definitions and interfaces
  • internal/config/config.go - Configuration management
  • internal/models/models.go - Model registry and management
  • internal/backend/backend.go - Backend abstraction layer (an interface sketch follows these package lists)
  • internal/proxy/proxy.go - Main proxy server logic
  • internal/streaming/streaming.go - Streaming response handling

Public Packages

  • pkg/anthropic/anthropic_backend.go - Anthropic Claude integration
  • pkg/openai/openai_backend.go - OpenAI GPT integration
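
Both provider packages plug into the proxy through a common backend abstraction. The sketch below is illustrative only, with hypothetical type names; the real definitions live in internal/types/types.go and internal/backend/backend.go and may differ:

package types

import "context"

// Message mirrors the Ollama chat message shape.
type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// ChatRequest is a request in Ollama's /api/chat format.
type ChatRequest struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}

// ChatResponse is one complete or partial reply chunk.
type ChatResponse struct {
	Model   string  `json:"model"`
	Message Message `json:"message"`
	Done    bool    `json:"done"`
}

// Backend is the contract a provider integration would fulfill; the
// Anthropic and OpenAI packages translate these calls into their
// providers' native APIs.
type Backend interface {
	// Name reports the provider identifier, e.g. "anthropic".
	Name() string
	// Chat performs a single non-streaming completion.
	Chat(ctx context.Context, req ChatRequest) (ChatResponse, error)
	// ChatStream emits partial responses until a chunk with Done=true.
	ChatStream(ctx context.Context, req ChatRequest) (<-chan ChatResponse, error)
}

Under a shape like this, adding a new provider would come down to implementing one interface.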

Test Files

All test files are organized by type:

Unit Tests (test/unit/)

  • config_test.go - Configuration tests
  • model_management_test.go - Model management tests
  • ollama_api_test.go - Ollama API compatibility tests
  • proxy_test.go - Basic proxy tests

Integration Tests (test/integration/)

  • integration_test.go - End-to-end integration tests

🧪 Testing

The project includes comprehensive tests:

  • Unit Tests - Individual component testing
  • Integration Tests - End-to-end workflow testing
  • API Compatibility Tests - Ollama API compliance verification
  • Model Management Tests - Model registry functionality
  • Configuration Tests - Environment variable handling

See docs/TEST_SUMMARY.md for detailed test documentation.
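
For a flavor of the unit tests, here is a hypothetical configuration test in the spirit of test/unit/config_test.go; loadPort is a stand-in helper, not the proxy's actual loader:

package unit

import (
	"os"
	"testing"
)

// loadPort stands in for the real config loader: it reads PORT from
// the environment and falls back to the default 11434.
func loadPort() string {
	if v := os.Getenv("PORT"); v != "" {
		return v
	}
	return "11434"
}

func TestLoadPort(t *testing.T) {
	// t.Setenv restores the original value when the test finishes.
	t.Setenv("PORT", "9999")
	if got := loadPort(); got != "9999" {
		t.Fatalf("PORT override: want 9999, got %q", got)
	}

	os.Unsetenv("PORT")
	if got := loadPort(); got != "11434" {
		t.Fatalf("default port: want 11434, got %q", got)
	}
}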

🔧 Configuration

The proxy can be configured via environment variables:

# Server configuration
PORT=11434
GIN_MODE=release

# API Keys
ANTHROPIC_API_KEY=your_anthropic_key_here
OPENAI_API_KEY=your_openai_key_here

# Model configuration
DEFAULT_MAX_TOKENS=4096
STREAMING_CHUNK_SIZE=3
STREAMING_DELAY_MS=50
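
How the proxy consumes these variables is up to internal/config; the loader below is a minimal sketch, assuming defaults that mirror the values above rather than the proxy's actual behavior:

package config

import (
	"os"
	"strconv"
)

// Config gathers the environment-driven settings listed above.
type Config struct {
	Port             string
	AnthropicAPIKey  string
	OpenAIAPIKey     string
	DefaultMaxTokens int
}

// getenv returns an environment variable or a fallback when unset.
func getenv(key, fallback string) string {
	if v := os.Getenv(key); v != "" {
		return v
	}
	return fallback
}

// Load reads the configuration from the environment.
func Load() Config {
	maxTokens, err := strconv.Atoi(getenv("DEFAULT_MAX_TOKENS", "4096"))
	if err != nil {
		maxTokens = 4096 // fall back on unparsable input
	}
	return Config{
		Port:             getenv("PORT", "11434"),
		AnthropicAPIKey:  os.Getenv("ANTHROPIC_API_KEY"),
		OpenAIAPIKey:     os.Getenv("OPENAI_API_KEY"),
		DefaultMaxTokens: maxTokens,
	}
}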

🎯 Features

  • Ollama API Compatibility - Full compatibility with the Ollama API format (request example below)
  • JetBrains IDE Support - Works seamlessly with the GoLand AI Assistant
  • Multi-Backend Support - Anthropic and OpenAI backends behind a single endpoint
  • Streaming Support - Both streaming and non-streaming responses
  • Dynamic Model Fetching - Automatically discovers models from the provider APIs at startup
  • CORS Support - Cross-origin request handling
  • Comprehensive Testing - Unit, integration, and API-compatibility test suites
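
As a concrete illustration of the Ollama-format chat endpoint, the snippet below posts a non-streaming request; the model name is a placeholder for whatever your backends expose via /api/tags:

package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// A request in Ollama's /api/chat format; "stream": false asks
	// for one complete response instead of incremental chunks.
	payload := []byte(`{
		"model": "claude-sonnet",
		"messages": [{"role": "user", "content": "Say hello"}],
		"stream": false
	}`)

	resp, err := http.Post("http://localhost:11434/api/chat",
		"application/json", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}

Setting "stream": true instead yields newline-delimited JSON chunks, which is the mode streaming clients such as the IDE integration rely on.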

📚 Documentation

  • README.md - This file (main documentation)
  • docs/TEST_SUMMARY.md - Detailed test documentation
  • examples/main.go - Usage examples
  • scripts/ - Build and utility scripts

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Run the test suite: make test
  6. Submit a pull request

📄 License

This project is licensed under the MIT License.

🆘 Support

For issues and questions:

  1. Check the documentation
  2. Run the test suite to verify setup
  3. Check the logs for error messages
  4. Open an issue on GitHub