priya-dwivedi/Deep-Learning

crack detection model question


In PyTorch 1.1 and later, if you call the learning rate scheduler before updating the optimizer with optimizer.step(), PyTorch skips the first value of the learning rate schedule, and a warning is raised saying so. How do we deal with this?

Below is the warning:
warnings.warn("Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
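The fix is to reorder the two calls in the training loop so the optimizer updates the weights before the scheduler advances the learning rate. Below is a minimal sketch of the correct ordering; the model, loss, optimizer, and scheduler here are placeholders for illustration, not the ones used in the crack detection notebook.

```python
import torch
import torch.nn as nn

# Placeholder model and training objects for illustration only;
# substitute the crack detection model, optimizer, and scheduler.
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

for epoch in range(20):
    inputs = torch.randn(4, 10)          # dummy batch
    targets = torch.randint(0, 2, (4,))  # dummy labels

    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()

    optimizer.step()   # update the weights first...
    scheduler.step()   # ...then advance the learning rate schedule
```

With this ordering the warning goes away and the schedule starts from its first learning rate value instead of skipping it. If the existing code calls scheduler.step() at the top of the epoch loop, moving it after the optimizer update (or to the end of the epoch) should be the only change needed.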