ParisNeo/lollms-webui

Importing model

ShikoThePro opened this issue · 8 comments

I like the app very much, but I've run into a small problem.

I want to import the ggml-gpt4all-j-1.3-groovy.bin model, but I don't know how to do that.

Please help me as soon as possible.

Copy it to the \models\gpt_4all\ folder and restart the app.
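
If you prefer to do the copy from a script instead of the file manager, here is a minimal sketch. It assumes the webui lives in ~/lollms-webui and the model was downloaded to ~/Downloads; both paths are assumptions, so adjust them to your setup:

```python
from pathlib import Path
import shutil

# Assumed locations; adjust to where you installed the webui and downloaded the model.
downloaded_model = Path.home() / "Downloads" / "ggml-gpt4all-j-1.3-groovy.bin"
gpt4all_models_dir = Path.home() / "lollms-webui" / "models" / "gpt_4all"

# Create the binding's models folder if it does not exist yet, then copy the file in.
gpt4all_models_dir.mkdir(parents=True, exist_ok=True)
shutil.copy2(downloaded_model, gpt4all_models_dir / downloaded_model.name)
print("Copied to:", gpt4all_models_dir / downloaded_model.name)
```

Then restart the app as described above.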

I did what you said, but what is the next step? That is my question. Thanks in advance.

Hi, once the model is in place, make sure to select the right binding. For example, if you put it in the gpt_4all folder, go back to the UI, open the bindings zoo and activate the gpt4all binding, then go to the models zoo and scroll down. You should see it there.
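
If it still does not show up, a quick sanity check is to list what is actually sitting in that folder. This is only a sketch: it assumes a default install path under ~/lollms-webui and that the models zoo lists the .bin files it finds there (the exact discovery rules may differ between versions):

```python
from pathlib import Path

# Assumed install location; change this to wherever your lollms-webui checkout lives.
models_dir = Path.home() / "lollms-webui" / "models" / "gpt_4all"

if not models_dir.is_dir():
    print(f"Folder not found: {models_dir}")
else:
    bins = sorted(models_dir.glob("*.bin"))
    if not bins:
        print(f"No .bin files found in {models_dir}")
    for f in bins:
        # These are the files the models zoo would be expected to pick up for this binding.
        print(f"  {f.name} ({f.stat().st_size / 1e9:.2f} GB)")
```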

I put my gpt4all model in the gpt4all folder as you mentioned, then I went to the UI and selected gpt_4all in the bindings zoo, then I went to the models zoo and scrolled through it, and the gpt4all model still isn't available. What can I do? Thanks for your reply.

For your information, I put the model in model/gpt_4all the first time I tested it, and in bindings/gpt_4all the second time, to check all the possibilities.

Suggestion: it would help if you made a YouTube video explaining how to do this.

After selecting the gpt4all binding, the model may also appear at the bottom of the models list.

Please understand: I put my gpt4all model in the model/gpt_4all folder, then I went to the UI, selected the gpt4all binding, and scrolled the models zoo from top to bottom, but nothing appears. I need some help.

I want to use this model as soon as possible.

Why aren't you answering me faster?

I'm having the same issue as Shiko.

Fresh install on macOS. I've set up my configuration as specified by adding a 'Models' folder.

Screenshot 2023-06-20 at 8 09 49 AM

Models I'm trying to use in the app:

Screenshot 2023-06-20 at 8 11 48 AM

local_config.yaml:

Screenshot 2023-06-20 at 8 25 01 AM

I launch the webui and go to Settings. In this instance, I have placed the models in py_llama_cpp and selected the PyLLamaCpp binding. None of the models I manually placed in the py_llama_cpp directory show up in the Model Zoo list, although the UI shows that I have one selected.

Screenshot 2023-06-20 at 8 27 16 AM

Nothing loads and there is no option to enter a prompt.

Example local_config.yaml with a different model entered:

Screenshot 2023-06-20 at 8 30 40 AM

Results:

Screenshot 2023-06-20 at 8 36 02 AM

I've tried moving the models into c_transformers and llama_cpp_official and in every instance they do not appear in the Model Zoo list. Models that I have downloaded through the webui do appear, however.
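
In case it helps whoever picks this up, here is a small survey sketch to compare what actually sits in each binding folder (manually placed files versus the ones the webui downloaded). It assumes the app was cloned to ~/lollms-webui and that each binding keeps its models under models/<binding_name>; both are assumptions on my part:

```python
from pathlib import Path

# Assumed root of the lollms-webui checkout; adjust to your install.
models_root = Path.home() / "lollms-webui" / "models"

# Binding folders mentioned in this thread; your install may have more or fewer.
for binding in ("py_llama_cpp", "c_transformers", "llama_cpp_official", "gpt_4all"):
    folder = models_root / binding
    files = sorted(p.name for p in folder.iterdir() if p.is_file()) if folder.is_dir() else []
    print(f"{binding}: {files if files else '(missing or empty)'}")
```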

Is there an issue with the config.yaml files? Filenames?

I don't know much about sorting out or troubleshooting this issue, so any input or guidance would be appreciated.
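
For what it's worth, this is the kind of quick check I would try to see what the config currently points at, so the selected binding and model name can be compared with the filenames on disk. It assumes PyYAML is installed and that local_config.yaml lives under configs/ in the checkout; both the path and the layout are assumptions and may differ on your version:

```python
from pathlib import Path
import yaml  # PyYAML; install with `pip install pyyaml` if needed

# Assumed location of the config; it may live elsewhere depending on the version/install.
config_path = Path.home() / "lollms-webui" / "configs" / "local_config.yaml"

with open(config_path, "r", encoding="utf-8") as f:
    config = yaml.safe_load(f)

# Print every key/value so the selected binding and model name can be checked by eye.
for key, value in sorted(config.items()):
    print(f"{key}: {value}")
```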