Issues
[Bug]: Vertex AI: Missing module 'google'
#6521 opened - 1
[Bug]: Unexpected Model Reference in LiteLLM API Call Logs – Clarification Needed on Model Cost Mapping
#6518 opened - 5
[Bug]: drop_params=True not working for dimensions parameter with OpenAI-compatible endpoints
#6516 opened - 1
[Bug]: Bad input type detection in amazon_titan_multimodal_transformation.py when running litellm.aembedding
#6515 opened - 0
Support for Gemini 1.5 Flash 8B on Vertex?
#6511 opened - 0
[Bug]: OpenTelemetry logging error in set_attributes 'NoneType' object has no attribute 'get'
#6510 opened - 0
Response_Cost for transcription(model="whisper-1", file=audio_file) consistently returns 0.0
#6509 opened - 0
[Feature]: Add China major LLMs integration
#6503 opened - 0
[Bug]: "Error in _get_parent_otel_span_from_kwargs: kwargs is None" after upgrade to 1.51.1
#6500 opened - 3
[Feature]: Set max fallbacks on router
#6498 opened - 0
[Bug]: Custom API Server for image generation not working .... failed in get_llm_provider
#6496 opened - 3
[Bug]: redis db setting not very consistent
#6491 opened - 3
[Bug]: STORE_MODEL_IN_DB broken
#6490 opened - 3
[Bug]: Gemini cost per token is incorrect
#6478 opened - 0
[Bug]: Extra headers not passed to /health
#6474 opened - 0
[Feature]: Adding extra headers support for embedding, image generation, and other Azure provider functions
#6465 opened - 3
[Feature]: LDAP Group to Team Sync
#6461 opened - 0
error using litellm with gemini
#6459 opened - 1
Streaming chunk size
#6458 opened - 0
return_response_headers for openai client
#6455 opened - 5
unable to convert to pydantic
#6453 opened - 0
[Feature]: Enhance virtual keys tab in web UI
#6448 opened - 0
[Feature]: Remove Team Members in UI
#6446 opened - 1
[Bug]: litellm_proxy_total_requests_metric_total metrics only work for config defined models
#6445 opened - 0
[Bug]: Fallback login not working with 1.50.2
#6439 opened - 2