c0sogi/llama-api

Stopped working after enabling CUDA


Hi, this was working quite well on CPU for me, but after I gave the tool access to the paths for libcublas, it compiled with CUDA support and now fails to start or load models because my 3080 doesn't have enough VRAM.

How do I completely disable CUDA so that I can use the tool again? I've tried removing the CUDA entries from PATH and LD_LIBRARY_PATH, but the installer still seems to build in CUDA mode.
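
For context, here is roughly what I'd expect to force a CPU-only rebuild of llama-cpp-python, assuming the installer goes through pip and honors the standard CMAKE_ARGS / FORCE_CMAKE environment variables (that's an assumption on my part; I haven't confirmed how this repo's install script passes build flags):

```python
# Hypothetical CPU-only reinstall of llama-cpp-python via pip.
# Assumes the CMake option is LLAMA_CUBLAS and that CMAKE_ARGS /
# FORCE_CMAKE are respected; neither is confirmed for this repo's
# own installer.
import os
import subprocess
import sys

env = os.environ.copy()
env["CMAKE_ARGS"] = "-DLLAMA_CUBLAS=OFF"   # build without cuBLAS/CUDA
env["FORCE_CMAKE"] = "1"                   # force a source rebuild

subprocess.check_call(
    [sys.executable, "-m", "pip", "install",
     "--force-reinstall", "--no-cache-dir", "llama-cpp-python"],
    env=env,
)
```

Is something like this the intended way to switch back, or does the tool have its own flag for it?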

Thanks