AbanteAI/rawdog

Update readme

iplayfast opened this issue · 1 comment

To use local models with Ollama, a sample config.yaml looks like this:

llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
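
For context, here is a minimal sketch (not from the issue) of what a config like this corresponds to, assuming the values are forwarded to a litellm completion call; the prompt text is made up for illustration.

from litellm import completion

# llm_model and api_base mirror llm_model / llm_base_url from the sample config above
response = completion(
    model="ollama/mistral",
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "List the files in the current directory."}],
)
print(response.choices[0].message.content)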

Very cool project

Thanks, I updated the readme.