Update readme
iplayfast opened this issue · 1 comment
iplayfast commented
To use local models with Ollama, a sample configuration is:

`config.yaml`:

```yaml
llm_api_key: no need  # Ollama does not require an API key
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
```
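If the tool can't reach the model, it can help to first confirm that the Ollama server is up and that the model has been pulled. A minimal sketch using Ollama's `GET /api/tags` endpoint, which lists locally available models (the base URL and model name mirror the config above; adjust to your setup):

```python
import json
import urllib.request

# Values mirroring the sample config above; adjust to your setup.
BASE_URL = "http://localhost:11434"
MODEL = "mistral"  # the part after "ollama/" in llm_model

# GET /api/tags returns the models the local Ollama server can serve.
with urllib.request.urlopen(f"{BASE_URL}/api/tags") as resp:
    tags = json.load(resp)

# Ollama names models with a tag suffix, e.g. "mistral:latest".
names = [m["name"] for m in tags.get("models", [])]
if any(n == MODEL or n.startswith(MODEL + ":") for n in names):
    print(f"ok: {MODEL} is available at {BASE_URL}")
else:
    print(f"missing: try `ollama pull {MODEL}` (found: {names})")
```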
Very cool project
jakethekoenig commented
Thanks, I updated the readme.