dave1010/clipea

How to use a local AI model?

Opened this issue · 1 comment

I tried installing the llm-gpt4all plugin, setting orca-mini-3b-gguf2-q4_0 as the default model, and writing 'orca-mini-3b-gguf2-q4_0' to ~/.config/clipea/clipea_default_model.txt.
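For context, here is a hedged sketch of the checks I would try, assuming clipea resolves model names through the `llm` CLI/library that the llm-gpt4all plugin targets (the exact model ID and the default-model step come from the plugin's docs, not from clipea itself):

```sh
# Sketch, assuming clipea picks up models via the `llm` library:
# install the plugin into the same environment clipea uses,
# confirm the exact model ID, and test the model outside clipea first.
llm install llm-gpt4all                     # install the gpt4all plugin for `llm`
llm models                                  # list available model IDs; copy the exact string
llm -m orca-mini-3b-gguf2-q4_0 "say hello"  # sanity-check the model directly
llm models default orca-mini-3b-gguf2-q4_0  # make it the default for `llm`
echo 'orca-mini-3b-gguf2-q4_0' > ~/.config/clipea/clipea_default_model.txt
```

If `llm models` does not list the orca model, the 'Unknown model' error may simply mean clipea is importing a different `llm` installation than the one the plugin was added to.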

However, when I run clipea ..., it returns 'Unknown model: orca-mini-3b-gguf2-q4_0' and still asks me for an OpenAI API key.

How can I use a local model instead of OpenAI's service?

I just found that https://github.com/TheR1D/shell_gpt can do it like this