Compressing a Llama 2 model fine-tuned with LoRA
Closed this issue · 1 comments
bkhanal-11 commented
Thank you for this amazing work. Is it possible to run wanda on a Llama 2 model fine-tuned with LoRA? When I gave it a try, I got the following error:
AttributeError: 'LlamaForCausalLM' object has no attribute 'layers'
Eric-mingjie commented
I think you need to change how the layers are accessed: from a `LlamaForCausalLM`, the decoder layers are at `.model.layers`, not `.layers`.
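A minimal sketch of the attribute layout behind the suggested fix. The toy classes below only mimic the structure of the real `transformers` classes (so the example runs without loading any weights), and the helper name `get_decoder_layers` is just for illustration, not part of wanda or transformers:

```python
class ToyLlamaModel:
    """Stand-in for transformers' LlamaModel: the decoder blocks live here."""
    def __init__(self, num_layers):
        self.layers = [f"decoder_block_{i}" for i in range(num_layers)]

class ToyLlamaForCausalLM:
    """Stand-in for LlamaForCausalLM: wraps the base model as `.model`."""
    def __init__(self, num_layers=4):
        self.model = ToyLlamaModel(num_layers)

def get_decoder_layers(model):
    """Return the decoder layers whether given the wrapper or the base model."""
    # Unwrap LlamaForCausalLM -> LlamaModel if a `.model` attribute exists.
    inner = getattr(model, "model", model)
    return inner.layers

model = ToyLlamaForCausalLM()
# model.layers would raise AttributeError, matching the reported error;
# model.model.layers (or the helper above) works:
print(len(get_decoder_layers(model)))
```

One extra caveat, as an assumption worth checking: if the LoRA adapters have not been merged into the base weights first (e.g. via peft's `merge_and_unload()`), the weights wanda sees may not reflect the fine-tuned model.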