Mrpatekful/swats
Unofficial PyTorch implementation of SWATS (Switching from Adam to SGD) optimization.
Python · MIT License
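A minimal usage sketch, assuming the package exposes a `SWATS` class that follows the standard `torch.optim.Optimizer` interface. The import path, class name, and constructor arguments below are assumptions, not taken from this page; check the repository for the actual API.

```python
import torch
import torch.nn as nn

# Import path and class name are assumptions; the repo's module layout may differ.
from swats import SWATS

model = nn.Linear(10, 1)
criterion = nn.MSELoss()

# Assumed constructor: the usual torch.optim-style signature, starting in the
# Adam phase with an Adam-like learning rate.
optimizer = SWATS(model.parameters(), lr=1e-3)

x, y = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(100):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()  # the optimizer decides internally when to switch from Adam to SGD
```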
Issues
- How to adjust the lr with step? (#13, opened by Geek-lixiang, 4 comments; see the scheduler sketch below this list)
- Training After Switch to SGD is Flat (#2, opened by noiran78)
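Regarding the learning-rate question above: the usual PyTorch approach is to wrap the optimizer in a standard scheduler from `torch.optim.lr_scheduler`, which works with any `torch.optim.Optimizer` subclass. A minimal sketch, assuming SWATS follows that interface; a plain SGD optimizer is used here as a stand-in so the snippet runs on its own.

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import StepLR

model = nn.Linear(10, 1)

# Stand-in optimizer; a SWATS instance would be wrapped the same way, since
# schedulers only rely on the optimizer's param_groups.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Multiply the learning rate by 0.1 every 30 scheduler steps (typically epochs).
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()  # dummy objective for illustration
    loss.backward()
    optimizer.step()   # parameter update
    scheduler.step()   # advance the schedule once per epoch

# lr was decayed at epochs 30, 60, and 90: 0.1 -> 0.01 -> 0.001 -> 0.0001
print(optimizer.param_groups[0]["lr"])
```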