Merge LoRA into 405B
junzhang-zj opened this issue · 4 comments
junzhang-zj commented
I am blocked from merging a LoRA adapter into LLaMA-3.1-405B loaded in INT8 with bitsandbytes (BNB); the specific details are in the comment linked below.
Is there anything I can try?
huggingface/peft#2065 (comment)
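For reference, a minimal sketch of the kind of merge workflow that hits this error, assuming the base model is loaded in 8-bit with bitsandbytes; the adapter path is a hypothetical placeholder, not taken from the linked report:

```python
# Hedged sketch: load LLaMA-3.1-405B in INT8 via bitsandbytes, attach a LoRA
# adapter, and try to merge it back into the base weights.
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-405B",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")  # hypothetical path

# merge_and_unload() has to dequantize the INT8 base weights before adding the
# LoRA delta; that dequantization step goes through peft/utils/integrations.py.
merged = model.merge_and_unload()
```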
junzhang-zj commented
The error is raised in peft/utils/integrations.py, at this line:
im, imt, SCim, SCimt, coo_tensorim = bnb.functional.double_quant(im)
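For context, here is a minimal, self-contained sketch of what that call does, assuming bitsandbytes 0.43.x (where bnb.functional.double_quant is still available) and a CUDA device; the matrix size is an arbitrary example, not the 405B hidden dimension:

```python
# double_quant() int8-quantizes a half-precision matrix row- and column-wise and
# returns the quantized tensors plus their absmax scaling statistics. PEFT feeds
# an identity matrix through this path to dequantize Int8Params weights via an
# int8 matmul, which is where the reported failure occurs.
import torch
import bitsandbytes as bnb

hidden_size = 4096  # arbitrary example size
im = torch.eye(hidden_size).contiguous().half().to("cuda")

# Returns: row-major int8, col-major int8, row stats, col stats, outlier COO tensor
im_q, imt_q, SCim, SCimt, coo_tensorim = bnb.functional.double_quant(im)
print(im_q.dtype, SCim.shape)  # torch.int8, one absmax scale per row
```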
matthewdouglas commented
Hi @junzhang-zj, can you let us know which versions of bitsandbytes, transformers, and PEFT you are using?
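For anyone reporting the same problem, one quick way to capture the three versions is a snippet along these lines:

```python
# Print the installed versions of the relevant packages.
import bitsandbytes, peft, transformers

print("bitsandbytes:", bitsandbytes.__version__)
print("transformers:", transformers.__version__)
print("peft:", peft.__version__)
```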
junzhang-zj commented
@matthewdouglas I have tried these versions: bitsandbytes 0.43.3, transformers 4.44.2/4.43.3, and peft 0.12.0.
junzhang-zj commented
Any updates on this issue?