Base Repository cloned from https://github.com/rupak-118/AI-papers
This repository contains the list of papers discussed in my blog series (23 Deep Learning Papers To Get You Started) on Medium.
- Part 1 - https://medium.com/@rupak.thakur/23-deep-learning-papers-to-get-you-started-part-1-308f80d7bba2
- Part 2
It also contains additional research papers which I keep finding useful along my DL journey.
- [TO READ] A Few Useful Things to Know About Machine Learning - Pedro Domingos : A very useful paper summarizing key practical lessons from an ML standpoint
- Introduction to CNNs : An article rather than a conventional paper, but one that provides a comprehensive explanation of the basic elements of CNNs through mathematical concepts and derivations
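As a rough illustration of the core operation such introductions derive, here is a minimal numpy sketch of a valid 2-D convolution (the function name and example values are mine, not from the article):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation (the 'convolution' used in CNNs):
    slide the kernel over the image, multiplying elementwise and summing."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

img = np.arange(16.0).reshape(4, 4)
k = np.array([[1.0, 0.0], [0.0, -1.0]])
res = conv2d(img, k)  # 3x3 output; each entry is img[i,j] - img[i+1,j+1]
```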
- Visualizing and Understanding Convolutional Networks - Zeiler and Fergus : A landmark paper in the history of deep learning. It opened the door to building more interpretable and explainable deep learning models
- Xavier initialization - Glorot and Bengio : Discusses the importance of appropriate weight initialization in neural networks and proposes a scheme that keeps activation and gradient variances roughly constant across layers
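A minimal numpy sketch of the uniform variant of the scheme (the helper name and layer sizes are mine):

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=np.random.default_rng(0)):
    """Glorot/Xavier uniform initialization: draw from U(-limit, limit)
    with limit = sqrt(6 / (fan_in + fan_out)), chosen so the variance of
    activations and gradients stays roughly constant across layers."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

W = xavier_uniform(256, 128)
# Empirical variance should be close to 2 / (fan_in + fan_out)
print(W.var(), 2.0 / (256 + 128))
```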
- Delving Deep into Rectifiers - Surpassing Human-Level Performance on ImageNet Classification : Introduces the PReLU activation function and He initialization (another weight initialization scheme, derived for ReLU-family activations)
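A small numpy sketch of the two ideas from this paper (function names and sizes are mine):

```python
import numpy as np

def he_normal(fan_in, fan_out, rng=np.random.default_rng(0)):
    """He initialization: N(0, sqrt(2 / fan_in)). The factor 2 compensates
    for ReLU-family activations zeroing out roughly half the inputs."""
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def prelu(x, a=0.25):
    """PReLU: identity for x > 0, a learnable slope `a` for x <= 0."""
    return np.where(x > 0, x, a * x)

W = he_normal(512, 256)
y = prelu(np.array([-2.0, 0.0, 3.0]))  # negative input scaled by a
```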
- Batch Normalization : Discusses internal covariate shift and how reducing it accelerates the training of deep neural networks
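A minimal numpy sketch of the training-time forward pass (variable names and shapes are mine; the running-statistics bookkeeping used at inference is omitted):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-normalize a (batch, features) array: standardize each feature
    over the batch, then apply a learnable scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(0).normal(5.0, 3.0, size=(64, 4))
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
# Each feature now has mean ~0 and std ~1 over the batch
```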
- An Overview of Gradient Descent Optimization Algorithms : Explains the math and intuition behind the different optimization algorithms under the gradient descent umbrella
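As one example from that family, here is a sketch of a single Adam update (function name, learning rate, and the toy objective are my choices):

```python
import math

def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and its square (v), with bias correction for the first steps."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)          # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)          # bias-corrected second moment
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = w^2 (gradient 2w) starting from w = 5.0
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
# w ends up close to the minimum at 0
```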
- Dropout - Hinton et al. : Introduces the hugely popular regularization technique, dropout, which has been a key component of most neural network architectures since and has also been highly successful in competitions.
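A minimal numpy sketch of the technique in its common inverted-dropout form (names and sizes are mine):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=np.random.default_rng(0)):
    """Inverted dropout: at train time, zero each unit with probability p
    and scale survivors by 1/(1-p) so expected activations match test
    time, where the layer is an identity."""
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

x = np.ones(1000)
y = dropout(x, p=0.5)
# Units are either dropped (0) or scaled up (2); mean stays ~1
```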