Issues running local model on Linux
Closed this issue · 2 comments
Running it on Kali Linux, and I'm having an issue utilizing the local model. The checkbox is ticked, but when I click install, nothing happens... It would be nice to have an acknowledgment or a progress bar...
It appears there are currently issues with the webLLM implementation when using the neurite.network host.
I will look into that; for now, you should be able to get it working from a cloned version of the repo.
The next update should make it easier to use any AI backend, so being limited to webLLM should not be an issue for much longer.
If you are using the neurite.network host, you should at least see a red X when trying to install the local AI. (I will look into getting it to work on the GitHub Pages host again as well; it potentially relates to the new domain name.)
Otherwise, when running Neurite locally with the build setup given in the readme, you should see a circular loading icon that slowly fills up as the parameters install.
I am not using Linux, but it would be useful to see any error logs you have, as well as your environment and which browser you used.
I have decided to remove webLLM in favor of Ollama. The next update will reflect that change.
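Once the switch to Ollama lands, a quick way to confirm a local model is actually available is to hit Ollama's local HTTP API directly. This is a hedged sketch against Ollama's default setup, not Neurite-specific code: it assumes Ollama is installed and serving on its default port (11434), and `llama3` is a placeholder for whatever model you have pulled.

```shell
# Verify the Ollama server is reachable and list locally installed models.
curl http://localhost:11434/api/tags

# Smoke-test a generation against a pulled model ("llama3" is a placeholder;
# substitute any model you have fetched with `ollama pull`).
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Hello",
  "stream": false
}'
```

If the first call fails to connect, the server isn't running (`ollama serve` starts it), which would explain a frontend install button appearing to do nothing.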