Project: https://apiquet.com/2019/04/14/deep-learning-digit-recognition-and-comparison/
The goal of this project was to understand neural networks: their parameters (learning rate, hidden layers, batch size, etc.), the main concepts (FNN, CNN, SGD, mini-batch, batch normalization, learning rate decay, regularization, etc.), and then the implementation of different architectures using techniques such as weight sharing and auxiliary losses.
Each architecture is implemented in its own file, named after the techniques it uses. For instance, CNN_WS_AL is a convolutional neural network with weight sharing and an auxiliary loss, while CNN_WS_NoAL is a CNN with weight sharing but no auxiliary loss, and so on.
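To illustrate what such a naming pattern refers to, here is a minimal PyTorch sketch of a CNN_WS_AL-style model: one convolutional branch whose weights are shared between the two digits of an input pair, an auxiliary digit-classification head on each branch, and a comparison head on the concatenated features. The input size, layer sizes, class names, and loss weighting below are assumptions for illustration, not the repository's actual code.

```python
# Sketch of the CNN_WS_AL idea: a shared convolutional branch (weight sharing)
# applied to each digit of a pair, an auxiliary digit-classification output per
# branch, and a main comparison output. Shapes and names are illustrative.
import torch
import torch.nn as nn

class CNN_WS_AL(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared branch: the same weights process both digit images.
        self.branch = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(64 * 2 * 2, 128), nn.ReLU(),
        )
        self.digit_head = nn.Linear(128, 10)       # auxiliary 10-class digit output
        self.compare_head = nn.Linear(2 * 128, 2)  # main pairwise comparison output

    def forward(self, x):
        # x: (batch, 2, 14, 14) -- one channel per digit of the pair (assumed input size)
        f1 = self.branch(x[:, 0:1])
        f2 = self.branch(x[:, 1:2])
        aux1, aux2 = self.digit_head(f1), self.digit_head(f2)
        main = self.compare_head(torch.cat([f1, f2], dim=1))
        return main, aux1, aux2

# Training would combine the main loss with the auxiliary losses, e.g.:
# loss = ce(main, target) + 0.5 * (ce(aux1, digit1) + ce(aux2, digit2))
```

The auxiliary losses give the shared branch a direct digit-recognition signal in addition to the comparison signal, which is the point of the AL variants; the NoAL variants would simply drop the two auxiliary terms.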