Issues
- Allow `max_lr` to be set per group (#14, opened by katsura-jp, 0 comments)
- Warmup steps only apply on the first cycle (#13, opened by katsura-jp, 1 comment)
- Additional Features (#11, opened by SteveImmanuel, 1 comment)
- License? (#10, opened by shreyaskamathkm, 2 comments)
- Weird gamma behavior (#8, opened by dofuuz, 1 comment)
- Is there a possibility to add verbose=True to see the increase/decrease in lr as training progresses? (#7, opened by rkakash59, 2 comments)
- Citation to use this scheduler (#5, opened by scaomath, 0 comments)
- base_lr relies on the lr of optimizer (#3, opened by DietDietDiet)
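As background for #3 (base_lr relies on the lr of optimizer), here is a minimal sketch using only stock PyTorch (`torch.optim.lr_scheduler`, not this repository's scheduler): in PyTorch's scheduler convention, `base_lrs` is read from the learning rate already set on the optimizer's param groups rather than passed to the scheduler directly.

```python
# Minimal sketch (plain PyTorch, not this repository's scheduler) showing that
# a scheduler's base_lrs come from the lr already set on the optimizer.
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR

model = torch.nn.Linear(4, 2)
optimizer = SGD(model.parameters(), lr=0.1)   # this lr becomes the scheduler's base_lr

# simple 10-step linear warmup multiplier applied on top of base_lr
scheduler = LambdaLR(optimizer, lr_lambda=lambda step: min(1.0, (step + 1) / 10))

print(scheduler.base_lrs)       # [0.1] -- read from optimizer.param_groups, not a constructor argument
print(scheduler.get_last_lr())  # current lr = base_lr * warmup multiplier at step 0
```

The same per-group convention is relevant to #14: stock PyTorch schedulers keep per-group values as lists aligned with `optimizer.param_groups` (as `base_lrs` above), so a per-group `max_lr` would presumably follow the same pattern.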