
Siamese network and auxiliary loss: different architectures implemented with weight sharing and an auxiliary loss to build a neural network that learns several tasks during training. Concepts used: FCN, CNN, SGD, mini-batch, batch normalization, learning rate decay, and regularization.


Deep_learning_digit_recognition_and_comparison

Project: https://apiquet.com/2019/04/14/deep-learning-digit-recognition-and-comparison/

The goal of this project was to understand neural networks: their parameters (learning rate, hidden layers, batch size, etc.), the main concepts (FCN, CNN, SGD, mini-batch, batch normalization, learning rate decay, regularization, etc.), and then the implementation of different architectures using weight sharing, auxiliary loss, etc.
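
The snippet below is a minimal sketch, not the repository's exact code, of how the training concepts listed above fit together: mini-batch SGD, L2 regularization through weight decay, and a step-based learning-rate decay schedule. The hyperparameter values are illustrative assumptions.

```python
import torch
from torch import nn, optim

def train(model, train_input, train_target, epochs=25, batch_size=100):
    criterion = nn.CrossEntropyLoss()
    # SGD with weight decay (L2 regularization); lr and weight_decay are assumed values
    optimizer = optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)
    # Learning-rate decay: halve the learning rate every 10 epochs
    scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

    for epoch in range(epochs):
        # Mini-batch loop over the training set
        for b in range(0, train_input.size(0), batch_size):
            output = model(train_input.narrow(0, b, batch_size))
            loss = criterion(output, train_target.narrow(0, b, batch_size))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        scheduler.step()
```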

Each architecture is implemented in a separate file, and the file names encode the variant. For instance, CNN_WS_AL is a convolutional neural network with weight sharing and an auxiliary loss, while CNN_WS_NoAL is a CNN with weight sharing but without the auxiliary loss, and so on.
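
As an illustration of what a "CNN_WS_AL" variant could look like, here is a minimal sketch (not the repository's exact architecture): a single convolutional branch applied to both digit images (weight sharing), a digit-classification head whose output feeds an auxiliary loss, and a comparison head for the main target. The 14x14 input resolution, layer sizes, and the 0.5 auxiliary weight are assumptions made for illustration.

```python
import torch
from torch import nn
import torch.nn.functional as F

class SiameseCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared convolutional branch (same weights applied to both digit images)
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3)
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3)
        self.bn = nn.BatchNorm2d(64)
        self.fc_digit = nn.Linear(64 * 2 * 2, 10)   # auxiliary head: 10 digit classes
        self.fc_compare = nn.Linear(2 * 10, 2)      # main head: binary comparison

    def branch(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))            # (N, 32, 6, 6)
        x = F.relu(F.max_pool2d(self.bn(self.conv2(x)), 2))   # (N, 64, 2, 2)
        return self.fc_digit(x.flatten(1))

    def forward(self, pair):
        # pair has shape (N, 2, 14, 14): two digit images per sample
        d1 = self.branch(pair[:, 0:1])
        d2 = self.branch(pair[:, 1:2])
        out = self.fc_compare(torch.cat((d1, d2), dim=1))
        return out, d1, d2

def combined_loss(model, pair, target, classes, aux_weight=0.5):
    # Auxiliary loss: mix the comparison loss with the per-digit
    # classification losses computed on the shared branches
    criterion = nn.CrossEntropyLoss()
    out, d1, d2 = model(pair)
    return (criterion(out, target)
            + aux_weight * (criterion(d1, classes[:, 0]) + criterion(d2, classes[:, 1])))
```

In the variants without the auxiliary loss (the NoAL files), only the comparison output would be trained; the auxiliary term pushes the shared branch to recognize the individual digits, which typically makes the comparison task easier to learn.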