Is lr_scheduler for Lomo implemented now?
Opened this issue · 4 comments
Hi authors,
Is lr = trainer.lr_scheduler.step(global_step) for Lomo implemented in the Trainer? If so, how do I enable it?
Thanks!
Sure, just passing the lr_scheduler to the trainer will enable it. Example:
https://github.com/OpenLMLab/collie/blob/c9cc0055a52b96d156450b5734a0a1d0dbde4562/examples/finetune_chatglm2_for_summary.py#L83
I mean the lr_scheduler for Lomo, not AdamW. The lr_scheduler for Lomo requires the lr_scheduler.step function to accept a global_step parameter.
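For context, the mismatch can be sketched like this. This is a toy illustration, not LOMO's actual scheduler: the class name LomoStyleLinearScheduler and its warmup schedule are made up; only the lr = lr_scheduler.step(global_step) call shape comes from the thread.

```python
class LomoStyleLinearScheduler:
    """Toy linear-warmup-then-constant schedule keyed on global_step.

    Mirrors the call pattern `lr = lr_scheduler.step(global_step)` from
    the question, where step() takes the step index and returns the lr.
    PyTorch schedulers instead expose step() with no step argument and
    mutate optimizer.param_groups internally, hence the incompatibility.
    """

    def __init__(self, base_lr, warmup_steps):
        self.base_lr = base_lr
        self.warmup_steps = warmup_steps

    def step(self, global_step):
        # Linear warmup over the first warmup_steps, then a constant lr.
        if global_step < self.warmup_steps:
            return self.base_lr * (global_step + 1) / self.warmup_steps
        return self.base_lr


sched = LomoStyleLinearScheduler(base_lr=0.1, warmup_steps=4)
lr_at_start = sched.step(0)    # partway through warmup
lr_after_warmup = sched.step(100)  # back to base_lr
```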
Sorry, I get you now. This is a bug, and we are planning to fix it so the API call aligns with PyTorch's lr_scheduler.
Currently, you may be able to pass this lr_scheduler to the trainer when training with Lomo:
https://github.com/OpenLMLab/LOMO/blob/24cde8e91feac437809bf7790f4727623dce6a76/src/utils.py#L207
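Until a fix lands, one hypothetical workaround is a thin adapter that lets a PyTorch-style scheduler (step() with no arguments) be driven by a trainer that calls step(global_step) and expects the learning rate back. This is a sketch under assumptions, not code from the repo; GlobalStepAdapter is invented here, and FakeOptimizer / FakeExponentialScheduler stand in for torch.optim objects so the example is self-contained.

```python
class FakeOptimizer:
    """Minimal stand-in for a torch.optim.Optimizer (param_groups only)."""

    def __init__(self, lr):
        self.param_groups = [{"lr": lr}]


class FakeExponentialScheduler:
    """Stand-in for a PyTorch-style scheduler: step() takes no arguments
    and mutates the optimizer's lr in place (here, multiply by gamma)."""

    def __init__(self, optimizer, gamma):
        self.optimizer = optimizer
        self.gamma = gamma

    def step(self):
        self.optimizer.param_groups[0]["lr"] *= self.gamma


class GlobalStepAdapter:
    """Hypothetical bridge: exposes step(global_step) -> lr on top of a
    PyTorch-style scheduler whose step() takes no step argument."""

    def __init__(self, torch_style_scheduler, optimizer):
        self._sched = torch_style_scheduler
        self._optimizer = optimizer
        self._last_step = 0

    def step(self, global_step):
        # Advance the wrapped scheduler once per new global step,
        # tolerating repeated calls with the same step value.
        while self._last_step < global_step:
            self._sched.step()
            self._last_step += 1
        # Return the current lr, matching the step(global_step) contract.
        return self._optimizer.param_groups[0]["lr"]


opt = FakeOptimizer(lr=1.0)
adapter = GlobalStepAdapter(FakeExponentialScheduler(opt, gamma=0.5), opt)
lr0 = adapter.step(0)  # no decay yet
lr3 = adapter.step(3)  # decayed three times
```

The while-loop makes the adapter idempotent for repeated calls with the same global_step, which matters if the trainer queries the lr more than once per step.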