load_in_8bit
d3ath-add3r opened this issue · 2 comments
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Model/Dolly_v2_3b",
    device_map="auto",
    load_in_8bit=True,
    # torch_dtype=torch.bfloat16
)
My system is Windows. Torch version: 1.13.1+cu116.
When I execute the code above, I get the following error:
RuntimeError:
CUDA Setup failed despite GPU being available. Please run the following command to get more information:
python -m bitsandbytes
Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues
And the bitsandbytes bug report:
===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please run
python -m bitsandbytes
and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
bin c:\workspace\nimish\environments\dolly_test\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so
False
CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching in backup paths...
CUDA SETUP: WARNING! libcuda.so not found! Do you have a CUDA driver installed? If you are on a cluster, make sure you are on a CUDA machine!
CUDA SETUP: Loading binary c:\workspace\nimish\environments\dolly_test\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so...
LoadLibrary() argument 1 must be str, not WindowsPath
CUDA SETUP: Problem: The main issue seems to be that the main CUDA library was not detected.
CUDA SETUP: Solution 1): Your paths are probably not up-to-date. You can update them via: sudo ldconfig.
CUDA SETUP: Solution 2): If you do not have sudo rights, you can do the following:
CUDA SETUP: Solution 2a): Find the cuda library via: find / -name libcuda.so 2>/dev/null
CUDA SETUP: Solution 2b): Once the library is found add it to the LD_LIBRARY_PATH: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:FOUND_PATH_FROM_2a
Please help.
As the error states, your CUDA library was not detected. Did you install CUDA?
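A quick way to narrow this down is to check whether PyTorch itself can see the GPU before debugging bitsandbytes. A minimal sketch (not from the original thread):

import torch

print(torch.version.cuda)          # CUDA version this PyTorch build targets
print(torch.cuda.is_available())   # False means no usable CUDA driver/runtime was found
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))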
This also doesn't look right: "Model/Dolly_v2_3b"
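If the goal is the Dolly v2 3B checkpoint from the Hugging Face Hub, the first argument is normally the Hub repo id rather than a local "Model/..." path. A minimal sketch, assuming databricks/dolly-v2-3b is the intended model (keep your local path instead if the files are already on disk):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "databricks/dolly-v2-3b"  # Hub repo id (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    load_in_8bit=True,  # still requires a working bitsandbytes + CUDA setup
)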