nomic-ai/nomic

Hardcoded lora path in GPT4AllGPU prevents loading of models

compilebunny opened this issue · 2 comments

The Python class GPT4AllGPU relies on a hard-coded lora path:
self.lora_path = 'nomic-ai/vicuna-lora-multi-turn_epoch_2'

The referenced lora no longer appears to be available on Hugging Face.

Please consider making this hard-coded path user-configurable, and updating the documentation so that future users know what the GPU class expects.
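A minimal sketch of the suggested change, assuming the existing constructor signature: accept the lora path as a parameter that defaults to the current value, so callers can point at any local or Hugging Face path. The parameter name and default here are illustrative.

```python
class GPT4AllGPU:
    def __init__(self, llama_path,
                 lora_path='nomic-ai/vicuna-lora-multi-turn_epoch_2'):
        # The lora path is now user-defined; the old hard-coded value
        # remains the default for backward compatibility.
        self.llama_path = llama_path
        self.lora_path = lora_path
```

A caller could then supply a working lora, e.g. `GPT4AllGPU(llama_path, lora_path='my-org/my-lora')`.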

We now recommend using the Python bindings shipped with GPT4All directly, and will be removing these GPT4AllGPU methods in a future release. See #155.

@bmschmidt

My understanding was that the GPT4All class uses the CPU and that GPT4AllGPU is required for GPU inference. Is this incorrect?

If that understanding is correct, how would one run these models on the GPU? Or is that outside the scope of this project?