Issues
- [Q/FR] "Legacy" Completion (#45, opened by NightMachinery, 0 comments)
- Error in function calling with Ollama (#70, opened by ultronozm, 16 comments)
- Errors when function calling with Claude (#62, opened by ultronozm, 3 comments)
- [FR] Support JSON mode (#47, opened by NightMachinery, 0 comments)
- Error using OpenAI (#53, opened by peterjauhal, 3 comments)
- Support function calling with Ollama (#51, opened by s-kostyaev, 1 comment)
- Open WebUI compatibility (#50, opened by LemonBreezes, 7 comments)
- Error using Ollama through a proxy (#48, opened by theasp, 27 comments)
- Using `make-llm-openai-compatible` with Azure OpenAI service fails to connect to endpoint (#36, opened by KaiHa, 1 comment)
- Using `make-llm-openai-compatible` with Mistral AI fails parsing the partial responses (#32, opened by KaiHa, 0 comments)
- [feature] Anthropic / Claude2 support (#22, opened by robertmeta, 1 comment)
- LLM request timed out for Gemini (#17, opened by whhone, 4 comments)
- Provide a way to cancel queries (#6, opened by Stebalien, 7 comments)
- Add ability to change the OpenAI API base URL (#9, opened by s-kostyaev, 1 comment)
- [feature] Support for the llama.cpp server? (#8, opened by draxil, 2 comments)
- Add support for HTTPS when using Ollama (#7, opened by mprasil, 22 comments)
- Provide example code for the Ollama provider (#3, opened by s-kostyaev, 2 comments)
- Ollama support (#2, opened by roman)