AbanteAI/rawdog

litellm error when model not listed?

iplayfast opened this issue · 3 comments

A strange one. I'm using ollama as a local model. If config.yaml is:

```yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
```

rawdog works perfectly.
But when I update config.yaml to:

```yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mixtral
```

so that it uses the mixtral model, rawdog now always returns an error:

What can I do for you? (Ctrl-C to exit)
> list files    

Error:
 {'model': 'ollama/mixtral', 'prompt': 'PROMPT: list files', 'response': " ```python\nimport os\n\nfiles = [f for f in os.listdir('.') if os.path.isfile(f)]\nfor file in files:\n    print(file)\n```", 'cost': None, 'error': 'Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral\n'}
Error: Execution error: Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral


What can I do for you? (Ctrl-C to exit)

This error seems to originate from litellm's model_prices_and_context_window.json (https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json), where mixtral indeed isn't listed but mistral is.
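
For anyone curious where the failure happens: litellm keeps an in-memory copy of that JSON, and the cost lookup rawdog apparently runs after each completion (note the 'cost': None field in the error above) fails when the model string isn't a key in it. A minimal sketch of that failure mode, assuming a litellm version from around the time of this issue (the exact exception text may vary):

```python
import litellm

# litellm.model_cost is the in-memory copy of model_prices_and_context_window.json
print("ollama/mixtral" in litellm.model_cost)  # False: mixtral isn't listed
print("ollama/mistral" in litellm.model_cost)  # True: mistral is listed

try:
    # Cost lookup for an unlisted model raises the error that rawdog surfaces
    litellm.cost_per_token(
        model="ollama/mixtral", prompt_tokens=10, completion_tokens=20
    )
except Exception as exc:
    print(exc)  # Model not in model_prices_and_context_window.json. ...
```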

Is it possible to get around this?

iplayfast commented

Found the secret sauce. The working config.yaml is:

```yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: ollama
llm_model: mixtral
```
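
For reference, this workaround roughly corresponds to a litellm call like the one below (a sketch only, assuming rawdog forwards llm_model, llm_base_url, and llm_custom_provider to litellm.completion unchanged):

```python
from litellm import completion

# Sketch of the call the working config amounts to (assumption: rawdog passes
# these config fields straight through to litellm.completion).
response = completion(
    model="mixtral",               # bare model name, no "ollama/" prefix
    custom_llm_provider="ollama",  # provider is given explicitly instead
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "list files"}],
)
print(response.choices[0].message.content)
```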

Please update the readme to show how to use models that litellm doesn't know about.

kvaky commented

I think the config documentation could be better too. It seems the maintainers are open to useful PRs, so feel free to create one.

I updated the readme. Thanks for reporting!