bitsandbytes-foundation/bitsandbytes

An error occurred: CUDA is required but not available for bitsandbytes.

GaoDalie opened this issue · 1 comment

System Info

Please, I have tried many ways but I couldn't resolve this issue. Could anyone give me a hint or help me solve this bug? I can't figure out where the problem is coming from.

Note: I have installed CUDA in my environment, but I am still getting an error.

Here is the error:

An error occurred: CUDA is required but not available for bitsandbytes. Please consider installing the multi-platform enabled version of bitsandbytes, which is currently a work in progress. Please check currently supported platforms and installation instructions at https://huggingface.co/docs/bitsandbytes/main/en/installation#multi-backend

Thank you so much.
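A quick way to narrow this down is to check whether the installed PyTorch build can see a CUDA device at all; a minimal sketch, assuming torch is importable in the same environment:

import torch

# If is_available() prints False or version.cuda prints None, the installed
# torch wheel is CPU-only or the NVIDIA driver is not visible to it.
print(torch.cuda.is_available())   # expected True on a working CUDA setup
print(torch.version.cuda)          # CUDA version the wheel was built against, or None
print(torch.cuda.device_count())   # number of visible GPUs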

Reproduction

import torch
from transformers import AutoModelForVision2Seq, AutoProcessor, BitsAndBytesConfig

# Hugging Face model id
try:
    model_id = "Qwen/Qwen2-VL-7B-Instruct"

    # BitsAndBytesConfig int-4 config
    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_use_double_quant=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    )

    # Load model and tokenizer
    model = AutoModelForVision2Seq.from_pretrained(
        model_id,
        device_map="auto",
        torch_dtype=torch.bfloat16,
        quantization_config=bnb_config,
    )

except Exception as e:
    print(f"An error occurred: {e}")

Expected behavior

The model should load with the 4-bit quantization config instead of raising the CUDA error quoted above.

Hey, I had the same problem. I fixed it by reinstalling PyTorch with CUDA support, using the install command from https://pytorch.org/get-started/locally/

When PyTorch is pulled in as a dependency of the Hugging Face libraries, it can end up as the CPU-only build without CUDA support, so that was the issue.
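For reference, a minimal sketch to verify the reinstall afterwards (the cu121 index URL in the comment is just an example; use whichever command the PyTorch selector page gives you for your setup):

# Example reinstall command (pick the one the selector page shows for your CUDA version):
#   pip install --force-reinstall torch --index-url https://download.pytorch.org/whl/cu121
import torch
import bitsandbytes as bnb

print(torch.__version__)          # should show a +cuXXX tag, not +cpu
print(torch.cuda.is_available())  # should now be True
print(bnb.__version__)            # bitsandbytes imports cleanly once CUDA is visible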