
Universal LLM API Interface


LLM Bridge

LLM Bridge is a Python package that provides a single Client class for interacting with many language model providers through one common interface.

Installation

Install directly from GitHub:

pip install git+https://github.com/jweissenberger/llm-bridge.git

Usage

Here's a basic example of how to use the LLM Bridge Client:

from llm_bridge import Client

client = Client()
# Use the client...

Features

  • Model Fallbacks

  • Load Balancing

  • PostHog Logging

  • Supported for every provider (OpenAI, Anthropic, Gemini, Groq, Mistral, Cerebras, Together, Perplexity, and HuggingFace TGI / vLLM):

    • Chat templates
    • Streaming
    • Async calls
    • Structured outputs
    • Response prefilling
    • Prompt caching
    • Function calling
    • Image support
    • File uploads
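To give a feel for the first two features, here is a minimal conceptual sketch of how model fallbacks and load balancing typically work. This is an illustration of the idea only, not llm-bridge's actual API; the function and provider names are hypothetical.

```python
import random

def call_with_fallback(prompt, provider_tiers):
    """Try each tier of providers in order (fallback); within a tier,
    shuffle the candidates so traffic is spread across equivalent
    providers (simple load balancing). Returns the first reply that
    succeeds, or raises if every provider fails."""
    errors = []
    for tier in provider_tiers:
        candidates = list(tier)
        random.shuffle(candidates)  # load balancing within a tier
        for provider in candidates:
            try:
                return provider(prompt)
            except Exception as exc:  # a real client would catch narrower errors
                errors.append(exc)
    raise RuntimeError(f"all providers failed: {errors}")

# Toy stand-ins for real provider calls (hypothetical):
def flaky_provider(prompt):
    raise ConnectionError("upstream unavailable")

def stable_provider(prompt):
    return f"echo: {prompt}"

# The flaky primary tier fails, so the call falls back to the second tier.
reply = call_with_fallback("hi", [[flaky_provider], [stable_provider]])
```

In a real client the tiers would map to configured models and API keys, and retries, timeouts, and logging would wrap each attempt.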