abhishekkrthakur/tez

Is it possible to set a variable LR per epoch?

gauravbrills opened this issue · 3 comments

@abhishekkrthakur I'm finding this framework great and easy to use. But as I'm fairly new to it, I was wondering whether there is a way to pass a variable LR for training, say a different value for every epoch.

Also, is there a way to continue training from a particular epoch if, say, the local system crashed or was interrupted during training?
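
For reference, a minimal resume-from-checkpoint sketch in plain PyTorch (not tez-specific; the file name, saved keys, and toy model are just illustrative):

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Save a checkpoint at the end of each epoch.
def save_checkpoint(epoch, path="checkpoint.pt"):
    torch.save(
        {
            "epoch": epoch,
            "model_state_dict": model.state_dict(),
            "optimizer_state_dict": optimizer.state_dict(),
        },
        path,
    )

# Resume: restore model/optimizer state and return the epoch to continue from.
def load_checkpoint(path="checkpoint.pt"):
    checkpoint = torch.load(path)
    model.load_state_dict(checkpoint["model_state_dict"])
    optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
    return checkpoint["epoch"] + 1
```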

OK, I think I got it: the overridden Model has a lot of hooks, and self.current_epoch can be used to handle this, though I'm not sure it is updated for subsequent epochs. I'll still check how to continue training from a given epoch; I think the on_epoch_start callback could be used here. A rough sketch of the manual per-epoch LR idea is below.
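
A minimal sketch of that manual approach, assuming an epoch-start hook is available (the hook name, the lr_schedule dict, and the self.optimizer / self.current_epoch attributes are assumptions here, not verified tez API):

```python
# epoch -> learning rate; epochs not listed keep the previous LR
lr_schedule = {0: 1e-3, 5: 1e-4, 10: 1e-5}

# Hypothetical hook on a tez-style model; name and attributes are assumed.
def on_epoch_start(self, *args, **kwargs):
    new_lr = lr_schedule.get(self.current_epoch)
    if new_lr is not None:
        # Update the LR on every parameter group of the optimizer.
        for param_group in self.optimizer.param_groups:
            param_group["lr"] = new_lr
```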

Why don't you use the StepLR scheduler from PyTorch? https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.StepLR.html
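
For example, StepLR decays the LR by a fixed factor every few epochs (a generic PyTorch sketch; wiring it into tez's training loop is not shown here):

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Multiply the learning rate by 0.1 every 10 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(30):
    # ... run one epoch of training ...
    scheduler.step()  # advance the schedule once per epoch
```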

Great, thanks @abhishekkrthakur, yes that will work as well.