How to use a local AI model?
pu-007 commented
I tried installing the llm-gpt4all plugin, set orca-mini-3b-gguf2-q4_0 as the default model, and also wrote 'orca-mini-3b-gguf2-q4_0' to ~/.config/clipea/clipea_default_model.txt.
However, when I run clipea ..., it returns 'Unknown model: orca-mini-3b-gguf2-q4_0' and still asks me for an OpenAI API key.
How can I use a local model instead of OpenAI's service?
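For reference, here is roughly what I did, written out as shell commands (the install and default-model commands are from the llm CLI; the clipea config path is the file mentioned above):

```sh
# Install the gpt4all plugin for the llm CLI
llm install llm-gpt4all

# Make orca-mini-3b-gguf2-q4_0 the default model for llm
llm models default orca-mini-3b-gguf2-q4_0

# Point clipea at the same model
echo "orca-mini-3b-gguf2-q4_0" > ~/.config/clipea/clipea_default_model.txt

# Running clipea after this still fails with:
#   Unknown model: orca-mini-3b-gguf2-q4_0
# and it prompts for an OpenAI API key.
```

I also ran `llm models` to list the models, in case the plugin's models are not visible to the llm installation that clipea uses, but I am not sure that is the cause.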
pu-007 commented
I just found that https://github.com/TheR1D/shell_gpt can do it like this