Issues
Placeholder for an arbitrary list of messages in the @chatprompt (for chat history)
#340 opened by gabayben - 0
Magentic doesn't marshal response into correct return type when using litellm w/ model ollama_chat/llama3.1
#316 opened by bhjelmar - 2
Support litellm router features?
#314 opened by benwhalley - 7
Support for OpenAI Structured Output
#295 opened by mnicstruwig - 6
Is it possible to provide a PDF in a prompt?
#308 opened by barapa - 0
Support for Anthropic vision
#250 opened by rawwerks - 2
Is it possible to make one of the several offered functions in a ParallelFunctionCall required?
#296 opened by CiANSfi - 6
Retry on failure to parse LLM output
#166 opened by jackmpcollins - 10
LangGraph Compatibility
#287 opened by mjrusso - 0
Do not pass stream_options param to AzureOpenAI
#261 opened by jackmpcollins - 3
Support OpenTelemetry
#136 opened by jackmpcollins - 0
Use OpenAI `parallel_tool_calls: false` to avoid unwanted Parallel Function Call
#240 opened by jackmpcollins - 2
Feature Request/Idea: implement Semantic Layer
#248 opened by qdrddr - 0
Add tests against the Azure OpenAI API
#266 opened by jackmpcollins - 1
Best way to generate a list of multiple objects? Currently, I have defined a list class that uses text to specify keys/values....
#235 opened by CiANSfi - 2
Support Gemini via Google Generative AI
#169 opened by jackmpcollins - 2
Allow custom_llm_parameter with LiteLLM backend
#222 opened by entropi - 1
Tool use not working for `AnthropicChatModel`
#219 opened by mnicstruwig - 5
`magentic` doesn't allow manual error handling when input arguments don't match tool schema
#211 opened by mnicstruwig - 1
Are there models in ollama that support function calling or object return?
#194 opened by chaos369 - 0
Add tests for ollama to github actions workflow
#205 opened by jackmpcollins - 2
Support Mistral API
#163 opened by jackmpcollins - 5
Expose LiteLLM supported params for LiteLLM Backend
#173 opened by hspak - 4
Add Anthropic backend
#170 opened by jackmpcollins - 1
Support Gemini via Vertex AI
#168 opened by jackmpcollins - 6
Magentic doesn't recognize function when using `mistral/mistral-large-latest` via `litellm`
#152 opened by mnicstruwig - 4
Async completion doesn't work for non-OpenAI LiteLLM model when using function calling
#153 opened by mnicstruwig - 7
Support `tools` in LiteLLM chat model / upgrade from deprecated `functions` parameter.
#138 opened by mnicstruwig - 5
Anthropic API support
#132 opened by mnicstruwig - 3
ollama example does not run
#78 opened by altruios - 0
Support `stop` sequences
#79 opened by mnicstruwig