OptimalScale/LMFlow

How to set learning rate decay in LISA fine-tuning


Thanks for your interest in LMFlow! You may specify the learning rate schedule in ./scripts/run_finetune_with_lisa.sh as follows:

```bash
...
python3 examples/finetune.py \
  --lr_scheduler_type cosine \
  ...
```

For the supported schedule types, you may refer to this page (https://github.com/huggingface/transformers/blob/main/src/transformers/trainer_utils.py#L405). Hope this information is helpful 😄
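
For example, a cosine decay schedule with linear warmup could be requested like this. This is a minimal sketch: `--lr_scheduler_type`, `--learning_rate`, and `--warmup_ratio` are standard Hugging Face TrainingArguments options, while the model path, dataset path, and LISA flag values below are illustrative placeholders rather than the actual contents of ./scripts/run_finetune_with_lisa.sh.

```bash
# Sketch of a LISA fine-tuning run with cosine decay and warmup.
# The scheduler/warmup flags are standard Hugging Face TrainingArguments
# options; the paths and LISA values here are placeholders for illustration.
python3 examples/finetune.py \
  --model_name_or_path meta-llama/Llama-2-7b-hf \
  --dataset_path data/alpaca/train \
  --learning_rate 5e-5 \
  --lr_scheduler_type cosine \
  --warmup_ratio 0.03 \
  --lisa_activated_layers 1 \
  --lisa_interval_steps 20 \
  --output_dir output_models/finetuned_lisa
```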

I get it, thanks!