Deep_Learning_Optimization

A compilation of notebooks dedicated to optimization processes in deep learning. The focus is on super-convergence, cyclical learning rates, Lookahead, Rectified Adam, and Adam.

Primary language: Jupyter Notebook. License: GNU General Public License v3.0 (GPL-3.0).

Workshop Material -- Consult the slides, available in the repository.

Additional Learning Resources

Review Optimization Methods in Deep Learning

https://www.slideshare.net/StefanKhn4/the-machinery-behind-deep-learning

Understand Super-Convergence

https://arxiv.org/pdf/1708.07120.pdf
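
A core ingredient of super-convergence is the one-cycle learning-rate policy: warm up to a large maximum learning rate, then anneal far below the starting value. The following is a minimal sketch of a linear one-cycle schedule; the function name and default values (`div_factor=25`, `final_div=1e4`) are illustrative choices, not taken from the paper verbatim.

```python
def one_cycle_lr(step, total_steps, max_lr=0.1, div_factor=25.0, final_div=1e4):
    """Linear one-cycle schedule in the spirit of super-convergence:
    warm up from max_lr / div_factor to max_lr over the first half of
    training, then anneal down to max_lr / final_div."""
    base_lr = max_lr / div_factor
    final_lr = max_lr / final_div
    half = total_steps / 2
    if step <= half:
        frac = step / half
        return base_lr + frac * (max_lr - base_lr)
    frac = (step - half) / (total_steps - half)
    return max_lr + frac * (final_lr - max_lr)
```

In practice, frameworks ship ready-made versions of this schedule (e.g. `OneCycleLR` in PyTorch), which also cycle momentum inversely to the learning rate.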

Learn about Cyclical Learning Rates (the foundation of super-convergence)

https://arxiv.org/abs/1506.01186
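
The triangular policy from this paper can be sketched in a few lines: the learning rate oscillates linearly between a lower and an upper bound, with one full cycle spanning `2 * step_size` training steps. The function name and default bounds below are illustrative.

```python
import math

def triangular_clr(step, base_lr=1e-4, max_lr=1e-2, step_size=2000):
    """Triangular cyclical learning rate (Smith, 2015): the LR rises
    linearly from base_lr to max_lr over step_size steps, then falls
    back, repeating every 2 * step_size steps."""
    cycle = math.floor(1 + step / (2 * step_size))
    x = abs(step / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)
```

The paper also proposes `triangular2` (halving the amplitude each cycle) and `exp_range` (exponentially decaying amplitude) variants.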

Get insights on Lookahead

https://arxiv.org/abs/1907.08610
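
Lookahead wraps an inner ("fast") optimizer: after every k fast steps, a set of slow weights is interpolated toward the fast weights, and the fast weights are reset to the slow ones. A minimal sketch of that slow-weight update, with parameters represented as plain lists of floats (the function name is illustrative):

```python
def lookahead_update(slow, fast, alpha=0.5):
    """Lookahead slow-weight step (Zhang et al., 2019): after k inner
    optimizer steps, move the slow weights a fraction alpha toward the
    fast weights, then reset the fast weights to the new slow weights."""
    new_slow = [s + alpha * (f - s) for s, f in zip(slow, fast)]
    return new_slow, list(new_slow)  # fast weights restart from slow ones
```

Because it only touches weights every k steps, Lookahead composes with any inner optimizer (SGD, Adam, RAdam) at negligible extra cost.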

Discover Rectified Adam

https://arxiv.org/pdf/1908.03265.pdf
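
Rectified Adam addresses the large variance of the adaptive learning rate early in training: while the variance is intractable it falls back to an unadapted (SGD-with-momentum-style) step, and afterwards it multiplies the Adam step by a rectification term r_t. A sketch of that term, following the formulas in the paper (the function name is illustrative):

```python
import math

def radam_rectification(t, beta2=0.999):
    """Variance rectification term of Rectified Adam (Liu et al., 2019).
    Returns None during the early steps where the adaptive LR variance
    is intractable (rho_t <= 4), otherwise the multiplier r_t."""
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2**t / (1.0 - beta2**t)
    if rho_t <= 4.0:
        return None  # fall back to an unadapted momentum step
    return math.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf)
                     / ((rho_inf - 4) * (rho_inf - 2) * rho_t))
```

As t grows, rho_t approaches rho_inf and r_t approaches 1, so RAdam converges to plain Adam late in training.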

Delve into Adam

https://arxiv.org/abs/1412.6980
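
Adam maintains exponential moving averages of the gradient and of its square, corrects both for initialization bias, and scales the step by the ratio. A minimal single-parameter sketch of one update, following the paper's Algorithm 1 (the function name is illustrative):

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (Kingma & Ba, 2014) for a scalar parameter.
    t is the 1-based step count, needed for bias correction."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2    # second-moment estimate
    m_hat = m / (1 - beta1**t)               # bias-corrected moments
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v
```

On the very first step the bias correction makes the update magnitude approximately `lr`, regardless of the gradient's scale, which is one reason Adam needs little tuning of its initial step size.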

This is a fork of an open-source project developed by another user. The project is now maintained by Skyworkin.