Unable to select model when using Ollama
Closed this issue · 3 comments
dchansen commented
Describe the bug
When modifying or adding a provider, the "model" field disappears when "ollama" is selected. It is available for all other providers, such as lmstudio or llamacpp.
To Reproduce
Steps to reproduce the behavior:
- Click "Manage twinny providers"
- Go to 'Add Provider'
- Set 'Provider' to 'ollamawebui'.
- Watch model field appear.
- Set 'Provider' to 'ollama'.
- Watch model field disappear.
Expected behavior
The model field should be available to allow the use of models other than codellama7b.
- OS: Windows
rjmacarthy commented
Hello, are you running the Ollama API? Does the /api/tags endpoint return models?
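For reference, a quick way to verify this (assuming Ollama is running on its default host and port, localhost:11434) is to query the /api/tags endpoint directly; it should return the locally installed models:

```typescript
// Quick check that the Ollama /api/tags endpoint returns models.
// Assumes Ollama is running on the default host/port (localhost:11434).
async function listOllamaModels(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags")
  if (!res.ok) {
    throw new Error(`Ollama returned ${res.status} ${res.statusText}`)
  }
  const data = (await res.json()) as { models: { name: string }[] }
  for (const model of data.models) {
    console.log(model.name)
  }
}

listOllamaModels().catch(console.error)
```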
dchansen commented
I reinstalled the plugin, and now it correctly lists the models. I'm unsure what went wrong.
Thanks for the quick reply and making a great plugin.
rjmacarthy commented
Thanks, I just applied a fix to fall back to a text box for the model name if the models are not fetched from the API correctly, many thanks.
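For anyone curious about the shape of such a fallback, here is a minimal sketch (not the actual twinny code; the types and helper names are illustrative only):

```typescript
// Illustrative sketch only: fall back to a free-text model name
// when the provider's model list cannot be fetched.
interface ModelFieldState {
  kind: "select" | "text"
  models: string[]
}

// fetchModels is a hypothetical helper that queries the provider's
// model-listing endpoint (e.g. Ollama's /api/tags) and returns names.
async function resolveModelField(
  fetchModels: () => Promise<string[]>
): Promise<ModelFieldState> {
  try {
    const models = await fetchModels()
    if (models.length > 0) {
      return { kind: "select", models }
    }
  } catch {
    // Ignore fetch errors and fall through to the text input.
  }
  // No models fetched: render a plain text box so the user can
  // still type a model name manually.
  return { kind: "text", models: [] }
}
```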