Issues
- A BUG in `BaseWarmup`? (#26, opened by Moon0316, 1 comment)
- Nice work! (#25, opened by Suasy, 2 comments)
- Can the `warmup_scheduler` update the learning rate every epoch and not every batch? (#17, opened by talrub, 9 comments)
- How to schedule LR with warmup on `global_step` initially, and then epochs after warmup? (#16, opened by JohnHerry, 1 comment)
- About the learning rate in the scheduler (#15, opened by Arios42, 2 comments)
- How to use in `pytorch-lightning`? (#8, opened by sieu-n, 3 comments)
- Unexpected keyword argument `warmup_period` (#11, opened by fbragman, 1 comment)
- License file is not included in sdist (#9, opened by awvwgk, 1 comment)
- No attribute named `dampening` (#7, opened by beegica, 2 comments)
- UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. (#5, opened by Mao-KU, 3 comments)
- Difference between this library and Hugging Face (#3, opened by brando90, 3 comments)
- Why is warmup better than RAdam? (#2, opened by brando90, 3 comments)
- What is the decay rule of thumb? (#4, opened by brando90)