YanaiEliyahu/AdasOptimizer
ADAS is short for Adaptive Step Size. Unlike optimizers that only normalize the derivative, it adapts the step size itself, making step-size scheduling obsolete and achieving state-of-the-art training performance.
C++ · MIT License
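
To make the idea of an adaptive step size concrete, below is a minimal C++ sketch, not the repository's actual algorithm: each parameter keeps its own step size, which is grown when consecutive gradients agree in sign and shrunk when they disagree. The class name `AdaptiveStepSGD`, the `meta_lr` parameter, and the sign-agreement rule are illustrative assumptions for this sketch only.

```cpp
// Illustrative sketch of adapting a per-parameter step size from gradient
// statistics (NOT the AdasOptimizer algorithm itself).
#include <cstddef>
#include <cstdio>
#include <vector>

struct AdaptiveStepSGD {
    std::vector<double> step;       // per-parameter step size
    std::vector<double> prev_grad;  // previous gradient, for the agreement test
    double meta_lr;                 // how fast the step sizes themselves adapt

    AdaptiveStepSGD(std::size_t n, double init_step, double meta_lr)
        : step(n, init_step), prev_grad(n, 0.0), meta_lr(meta_lr) {}

    void update(std::vector<double>& params, const std::vector<double>& grad) {
        for (std::size_t i = 0; i < params.size(); ++i) {
            // Grow the step when the current and previous gradients point the
            // same way, shrink it otherwise (hypergradient-style heuristic).
            step[i] *= 1.0 + meta_lr * ((grad[i] * prev_grad[i] > 0.0) ? 1.0 : -1.0);
            params[i] -= step[i] * grad[i];
            prev_grad[i] = grad[i];
        }
    }
};

int main() {
    // Toy objective: f(x, y) = x^2 + 10*y^2, gradient (2x, 20y).
    std::vector<double> p = {5.0, 5.0};
    AdaptiveStepSGD opt(p.size(), /*init_step=*/0.01, /*meta_lr=*/0.05);
    for (int t = 0; t < 200; ++t) {
        std::vector<double> g = {2.0 * p[0], 20.0 * p[1]};
        opt.update(p, g);
    }
    std::printf("x = %.6f, y = %.6f\n", p[0], p[1]);
    return 0;
}
```

Under these assumptions, the step size settles near the largest value the local curvature tolerates without a hand-tuned schedule, which is the general effect the project description refers to.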