LoRA fine-tuning: the base pretrained model is not wrapped with peft_model
Closed this issue · 1 comment
MXD6 commented
LLaMA3-8B-Instruct LoRA fine-tuning
After lora_config is defined, why is get_peft_model never called? The Trainer is passed the pretrained base model directly:
```python
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized_id,
    data_collator=DataCollatorForSeq2Seq(tokenizer=tokenizer, padding=True),
)
trainer.train()
```
KMnO4-zx commented
It is covered in the notebook demo file.
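For readers landing here: a minimal sketch of the missing step, assuming the standard peft API. The LoraConfig hyperparameters and target_modules below are illustrative, not necessarily the repo's exact values; see the notebook for the actual setup.

```python
from peft import LoraConfig, TaskType, get_peft_model

# Illustrative LoRA settings (assumptions, not the repo's exact config)
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    r=8,
    lora_alpha=32,
    lora_dropout=0.1,
)

# Wrap the base model BEFORE constructing the Trainer, so that only
# the LoRA adapter weights are trainable.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # sanity check: tiny fraction trainable

trainer = Trainer(
    model=model,  # now a PeftModel, not the raw pretrained model
    args=args,
    train_dataset=tokenized_id,
    data_collator=DataCollatorForSeq2Seq(tokenizer=tokenizer, padding=True),
)
trainer.train()
```

If get_peft_model is skipped, the Trainer performs full-parameter fine-tuning of the base model rather than LoRA fine-tuning.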