[BUG] LLM "ollama/llama3.1" does not obey BASE_URL Reopening #1337
Description
We are still facing the same issue as explained in [BUG] LLM "ollama/llama3.1" does not obey BASE_URL #1337
Steps to Reproduce
See LLM "ollama/llama3.1" does not obey BASE_URL #1337
Expected behavior
LLM "ollama/llama3.1" does not obey BASE_URL #1337
Screenshots/Code snippets
LLM "ollama/llama3.1" does not obey BASE_URL #1337
Operating System
Ubuntu 20.04
Python Version
3.10
crewAI Version
latest (0.63.5, per the comments below)
crewAI Tools Version
0.12.1
Virtual Environment
Venv
Evidence
LLM "ollama/llama3.1" does not obey BASE_URL #1337
Possible Solution
LLM "ollama/llama3.1" does not obey BASE_URL #1337
Additional context
LLM "ollama/llama3.1" does not obey BASE_URL #1337
The env var should be OPENAI_BASE_URL. Does that still not work?
You can also use the LLM class directly: https://docs.crewai.com/core-concepts/LLMs/#using-ollama-local-llms
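For reference, a minimal sketch of what that docs page describes (the model name and URL here are assumptions, adjust to your setup):

from crewai import LLM

llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434")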
That's what I am trying to do:
default_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434"),
a = Agent(
    role=agent_config['role'],
    goal=agent_config['goal'],
    backstory=agent_config['backstory'],
    llm=default_llm,
    max_iter=2,
    tools=tools,
)
I get the following errors:
llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
llm.py-llm:88 - ERROR: LiteLLM call failed: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=(<crewai.llm.LLM object at 0x128f49eb0>,)
Am I doing something wrong? I have the latest 0.63.5 version installed.
Thank you for any help @joaomdmoura
I see an extra comma at the end of the default_llm line you pasted. Is that in the code? It could be the issue.
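For what it's worth, that trailing comma makes default_llm a one-element tuple, which lines up with the model=(<crewai.llm.LLM object ...>,) in the error above. The line should presumably read:

default_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434")  # no trailing comma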
New version 0.64.0 is out. I didn't directly address this since I was not able to replicate it, but it could be that comma in there.
@joaomdmoura works now, it was either the comma (my bad) or the new version.
Anyway thank you for your help and keep up the good work :)
No worries!
How did you fix it? I got the same error.
I'm seeing the same issue:
Failed to convert text into a pydantic model due to the following error: litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7ebbb065c6b0>: Failed to establish a new connection: [Errno 111] Connection refused')) Using raw output instead.
CrewAI: 0.86.0
Pydantic: 2.10.3
Python: 3.12.3
Ubuntu: 24.04 (via Docker)
I tried both environment variables:
$ env | grep OPEN
OPENAI_BASE_URL=http://192.168.1.112:11434
OPENAI_API_BASE=http://192.168.1.112:11434
As well as specifying it through the LLM class:
llm = LLM(model=os.environ["LLM_MODEL"], base_url=os.environ["OPENAI_API_BASE"])
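If I read LiteLLM right, its Ollama provider falls back to http://localhost:11434 when no base URL reaches it, which would explain the connection refused inside Docker. A minimal sketch that passes the URL explicitly (the env values below are my assumptions, adjust to yours):

import os
from crewai import LLM

# Assumed values for illustration:
#   LLM_MODEL=ollama/llama3.1  (the "ollama/" prefix routes the call to the Ollama provider)
#   OPENAI_API_BASE=http://192.168.1.112:11434
llm = LLM(
    model=os.environ.get("LLM_MODEL", "ollama/llama3.1"),
    base_url=os.environ.get("OPENAI_API_BASE", "http://192.168.1.112:11434"),
)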