twinnydotdev/twinny

Unable to select model when using Ollama

Closed this issue · 3 comments

Describe the bug
When modifying or adding a provider, the "model" field disappears when "ollama" is selected. It is available for all other providers, such as lmstudio or llamacpp.

To Reproduce
Steps to reproduce the behavior:

  1. Click "Manage twinny providers"
  2. Go to 'Add Provider'
  3. Set 'Provider' to 'ollamawebui'.
  4. Watch model field appear.
  5. Set 'Provider' to 'ollama'.
  6. Watch model field disappear.

Expected behavior
The model field should be available to allow the use of models other than codellama7b.

Screenshots
[screenshot: "ollamawebui" provider with the model field visible]
[screenshot: "ollama" provider with the model field missing]

  • OS: Windows

Hello, are you running the Ollama API? Does the /api/tags endpoint return models?

Reinstalled the plugin, and now it correctly lists the models. Unsure what went wrong.
Thanks for the quick reply and for making a great plugin.

Thanks, I just applied a fix to fall back to a text box for the model name if the models are not fetched from the API correctly, many thanks.