lich99/ChatGLM-finetune-LoRA

NameError: name 'train_dataloader' is not defined

wccccp opened this issue · 1 comment

from transformers import get_linear_schedule_with_warmup

# train_dataloader is referenced here but never defined earlier in the script,
# which is what raises the NameError.
lr_scheduler = get_linear_schedule_with_warmup(
    optimizer=optimizer,
    num_warmup_steps=int(len(train_dataloader) / accumulate_step),
    num_training_steps=int(len(train_dataloader) / accumulate_step) * NUM_EPOCHS,
)

Look at LoRA_finetune_with_stanford_alpaca.ipynb instead if you need a dataloader. Please remember: this dataset is for demonstration purposes only.
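
For reference, a minimal sketch of defining a train_dataloader before the scheduler call, assuming already-tokenized examples and torch's DataLoader (SimpleDataset, collate_fn, tokenized_examples, and BATCH_SIZE below are illustrative placeholders, not the repo's exact code):

import torch
from torch.utils.data import DataLoader, Dataset


class SimpleDataset(Dataset):
    """Wraps a list of already-tokenized examples."""

    def __init__(self, examples):
        self.examples = examples

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        return self.examples[idx]


def collate_fn(batch):
    # Pad input_ids within the batch to a common length; labels are omitted for brevity.
    input_ids = [torch.tensor(ex["input_ids"]) for ex in batch]
    input_ids = torch.nn.utils.rnn.pad_sequence(input_ids, batch_first=True, padding_value=0)
    return {"input_ids": input_ids}


# Placeholder data standing in for the tokenized dataset.
tokenized_examples = [{"input_ids": [1, 2, 3]}, {"input_ids": [4, 5]}]

BATCH_SIZE = 2
train_dataloader = DataLoader(
    SimpleDataset(tokenized_examples),
    batch_size=BATCH_SIZE,
    shuffle=True,
    collate_fn=collate_fn,
)
# len(train_dataloader) is now defined, so the lr_scheduler setup above no longer raises a NameError.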