LLM Bridge is a Python package that provides a unified `Client` class for interacting with language models across multiple providers.
Install directly from GitHub:

```shell
pip install git+https://github.com/jweissenberger/llm-bridge.git
```
Here's a basic example of how to use the LLM Bridge `Client`:

```python
from llm_bridge import Client

client = Client()
# Use the client...
```
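One of the features listed below, model fallbacks, means trying a sequence of models until one succeeds. The sketch below is conceptual plain Python, not the llm-bridge API; the function and stub provider names are hypothetical, for illustration only.

```python
# Conceptual sketch of the model-fallback pattern (NOT the llm-bridge API).
# Each "provider" is a callable that takes a prompt and returns a completion,
# or raises on failure.
def complete_with_fallback(prompt, providers):
    errors = []
    for call in providers:
        try:
            return call(prompt)
        except Exception as exc:
            # Record the failure and fall through to the next provider.
            errors.append(exc)
    raise RuntimeError(f"All providers failed: {errors}")

# Stub providers for illustration:
def flaky_primary(prompt):
    raise TimeoutError("primary model unavailable")

def backup(prompt):
    return f"echo: {prompt}"

print(complete_with_fallback("hi", [flaky_primary, backup]))  # → echo: hi
```

A real implementation would typically also distinguish retryable errors (timeouts, rate limits) from permanent ones (bad requests) before falling back.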
- Model Fallbacks
- Load Balancing
- PostHog Logging
- Chat template support for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- Streaming support for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- Async support for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- Structured outputs for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- Response Prefilling for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- Prompt Caching for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- Function calling support for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- Image support for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
- File Uploads for:
  - OpenAI
  - Anthropic
  - Gemini
  - Groq
  - Mistral
  - Cerebras
  - Together
  - Perplexity
  - HuggingFace TGI/vLLM
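Load balancing, listed above, typically means spreading requests across a pool of model endpoints. The sketch below is a conceptual round-robin balancer in plain Python, not the llm-bridge API; the class and endpoint names are hypothetical, for illustration only.

```python
import itertools

# Conceptual sketch of round-robin load balancing (NOT the llm-bridge API).
# Requests are cycled across a fixed pool of endpoints in order.
class RoundRobinPool:
    def __init__(self, endpoints):
        # itertools.cycle yields the endpoints in order, repeating forever.
        self._cycle = itertools.cycle(endpoints)

    def next_endpoint(self):
        return next(self._cycle)

pool = RoundRobinPool(["endpoint-a", "endpoint-b"])
print([pool.next_endpoint() for _ in range(4)])
# → ['endpoint-a', 'endpoint-b', 'endpoint-a', 'endpoint-b']
```

Production balancers usually layer health checks and weighting on top of this, so that a slow or failing endpoint is skipped rather than visited in strict rotation.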