Cannot import LlamaConfig
from transformers import LlamaConfig
ImportError: cannot import name 'LlamaConfig' from 'transformers' (/home/aelkordy/.local/lib/python3.8/site-packages/transformers/__init__.py)
(test) aelkordy@g1lmd1:/vault/aelkordy/NLP_projects/pruning/LLM-Pruner$ python hf_prune.py --pruning_ratio 0.25 --block_wise --block_mlp_layer_start 4 --block_mlp_layer_end 30 --block_attention_layer_start 4 --block_attention_layer_end 30 --pruner_type taylor --test_after_train --device cpu --eval_device cuda --save_ckpt_log_name llama_prune
Traceback (most recent call last):
File "hf_prune.py", line 14, in
from LLMPruner.models.hf_llama.modeling_llama import LlamaForCausalLM, LlamaRMSNorm, LlamaAttention, LlamaMLP
File "/vault/aelkordy/NLP_projects/pruning/LLM-Pruner/LLMPruner/models/hf_llama/modeling_llama.py", line 33, in
from transformers import LlamaConfig
ImportError: cannot import name 'LlamaConfig' from 'transformers' (/home/aelkordy/.local/lib/python3.8/site-packages/transformers/__init__.py)
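This usually means the installed transformers release predates LLaMA support (LlamaConfig was added around transformers v4.28). A minimal sketch to check the installed version and whether the import works, assuming transformers was installed with pip:

import transformers

# Print the installed version; LlamaConfig requires roughly >= 4.28 (assumption).
print(transformers.__version__)

try:
    from transformers import LlamaConfig
    print("LlamaConfig import OK")
except ImportError:
    # Likely fix: upgrade the package, e.g. `pip install --upgrade transformers`
    print("LlamaConfig unavailable in this transformers version")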