nomic-ai/pygpt4all

Issues when I try to use the GPU option to load models.

javierp183 opened this issue · 1 comments

I get the following error message when I try to download models from Hugging Face and load them onto the GPU.

How can I use this option with GPT4All?

```
ValueError: The current device_map had weights offloaded to the disk. Please provide an offload_folder for them.
Alternatively, make sure you have safetensors installed if the model you are using offers the weights in this format.
```
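For reference, this error comes from the transformers/accelerate loading path rather than from pygpt4all itself. A minimal sketch of passing `offload_folder` when loading with `device_map="auto"` is shown below; the model id and folder path are placeholders, not taken from this thread.

```python
# Hypothetical sketch: satisfy the ValueError by giving accelerate a
# directory to offload weights that do not fit in GPU/CPU memory.
# Model id and folder name are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # placeholder model id

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",         # lets accelerate split layers across GPU, CPU and disk
    offload_folder="offload",  # directory used for weights offloaded to disk
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```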

@javierp183 this repo is just for CPU inference.
You have to use the GPU interface instead.