locuslab/wanda

Compressing a Finetuned llama2 model with lora

Closed this issue · 1 comment

Thank you for this amazing work. I was wondering if it was possible to run wanda on a llama2 model fine-tuned with lora? When I gave it a try, I got the following error:

AttributeError: 'LlamaForCausalLM' object has no attribute 'layers'

The error comes from how the layers are accessed on LlamaForCausalLM: the decoder layers live under `.model.layers`, not `.layers`, so you need `model.model.layers` instead.

https://github.com/huggingface/transformers/blob/v4.40.0/src/transformers/models/llama/modeling_llama.py#L1135
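A minimal sketch of where the layers actually live, using a tiny randomly initialized LlamaForCausalLM built from a config (the small config values are placeholders, just so no checkpoint download is needed):

```python
# Build a tiny LlamaForCausalLM from a config to show the attribute layout.
from transformers import LlamaConfig, LlamaForCausalLM

config = LlamaConfig(
    hidden_size=16,
    intermediate_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    num_key_value_heads=2,
    vocab_size=128,
)
model = LlamaForCausalLM(config)

# The top-level object has no `.layers` -- this is what raises the AttributeError.
assert not hasattr(model, "layers")

# The decoder layers sit on the inner LlamaModel.
layers = model.model.layers
print(len(layers))
```

If the LoRA-finetuned model is still wrapped in a PEFT `PeftModel`, one option is to merge the adapters back into the base model first (e.g. with PEFT's `merge_and_unload()`), which returns a plain `LlamaForCausalLM` that the pruning code can traverse as above.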