shape mismatch error
Celppu opened this issue · 0 comments
Celppu commented
With
python quant_infer.py --wbits 4 --load ... /vicuna-7B-GPTQ-4bit-128g/vicuna-7B-GPTQ-4bit-128g.pt --text "piip paap" --max_length 24 --cuda cuda:0
I get
....
size mismatch for lm_head.weight: copying a param with shape torch.Size([32001, 4096]) from checkpoint, the shape in current model is
torch.Size([32000, 4096]).
....
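
For anyone hitting the same thing: the checkpoint was saved from a model whose tokenizer had one extra token (vocab size 32001, e.g. Vicuna's added pad token), while the script builds the model with the stock LLaMA vocab of 32000, so `load_state_dict` refuses the 32001-row `lm_head.weight`. A minimal sketch of the mismatch and one possible workaround; the sizes here are shrunk for speed and the fix shown is an assumption about this repo, not a confirmed patch:

```python
import torch
import torch.nn as nn

# Real shapes are [32001, 4096] (checkpoint) vs [32000, 4096] (model);
# shrunk here so the sketch runs instantly.
hidden, model_vocab, ckpt_vocab = 4, 10, 11

model_head = nn.Linear(hidden, model_vocab, bias=False)
ckpt = {"weight": torch.zeros(ckpt_vocab, hidden)}

try:
    # Fails the same way quant_infer.py does: 11 rows vs 10.
    model_head.load_state_dict(ckpt)
except RuntimeError as e:
    print("size mismatch:", e)

# One workaround: rebuild the head with the checkpoint's vocab size
# before loading (for a full Hugging Face model the analogous call
# would be model.resize_token_embeddings(32001)).
model_head = nn.Linear(hidden, ckpt_vocab, bias=False)
model_head.load_state_dict(ckpt)  # now loads cleanly
```

The same idea applies to the tied input embedding (`embed_tokens`), which usually shows the identical 32001-vs-32000 mismatch in the elided part of the log.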