bitsandbytes-foundation/bitsandbytes

Merge LoRA into 405B

junzhang-zj opened this issue · 4 comments

I am blocked from merging LoRA adapters into LLaMA-3.1-405B quantized with bitsandbytes INT8; the specific details are below.
Is there any action that I can try?
huggingface/peft#2065 (comment)

Error location: peft/utils/integrations.py
im, imt, SCim, SCimt, coo_tensorim = bnb.functional.double_quant(im)
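For context, `double_quant` performs row-wise absmax INT8 quantization, and merging LoRA weights requires dequantizing back to floating point first. The sketch below is purely illustrative (it is not bitsandbytes code, and the function names are made up for this example); it shows the basic absmax quantize/dequantize round trip that the merge path depends on:

```python
# Illustrative sketch only -- NOT the bitsandbytes implementation.
# Shows per-row absmax INT8 quantization and the dequantization step
# that a LoRA merge needs before adding adapter deltas in float.

def int8_absmax_quantize(row):
    """Quantize one row of floats to int8 using a per-row absmax scale."""
    absmax = max(abs(x) for x in row) or 1.0
    scale = absmax / 127.0
    q = [max(-127, min(127, round(x / scale))) for x in row]
    return q, scale

def int8_dequantize(q, scale):
    """Recover approximate float values from int8 codes and the scale."""
    return [v * scale for v in q]

row = [0.5, -1.2, 0.03, 2.4]
q, scale = int8_absmax_quantize(row)
approx = int8_dequantize(q, scale)
# Each recovered value differs from the original by at most one scale step.
```

The actual failure here happens inside PEFT's integration code when it calls this quantization routine on the 405B weights, so the versions of both libraries matter.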

Hi @junzhang-zj, can you let us know which versions of bitsandbytes, transformers, and PEFT you are using?

@matthewdouglas I have tried these versions: bitsandbytes (0.43.3), transformers (4.44.2/4.43.3), and peft (0.12.0).

Any updates on this issue?