
Experimented with different MLP architectures on the MNIST dataset using different dropout rates.


MLP-Architetures-on-MNIST-dataset

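The notebook's exact code isn't reproduced here; the sketch below is a minimal Keras MLP of the kind these experiments describe, with the dropout rate, the number of hidden layers, and BatchNormalization exposed as the knobs the findings vary. The 512-unit layer width, Adam optimizer, and batch size of 128 are illustrative assumptions, not values taken from the notebook.

```python
# Minimal sketch (not the repo's exact code): a Keras MLP on MNIST with
# configurable dropout, hidden-layer count, and optional BatchNormalization.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_mlp(hidden_layers=2, dropout=0.2, use_batchnorm=True):
    model = models.Sequential([layers.Input(shape=(784,))])
    for _ in range(hidden_layers):
        model.add(layers.Dense(512, activation="relu"))  # assumed layer width
        if use_batchnorm:
            model.add(layers.BatchNormalization())
        model.add(layers.Dropout(dropout))
    model.add(layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Load and flatten MNIST, scaling pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = build_mlp(hidden_layers=2, dropout=0.2, use_batchnorm=True)
model.fit(x_train, y_train, batch_size=128, epochs=10, validation_split=0.1)
model.evaluate(x_test, y_test)
```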
1] As dropout is increased from 0.2 to 0.5 with the same batch size and architecture, the number of epochs needed starts increasing.
2] After reaching a particular epoch, the model overfits (see the sketch after this list for one way to detect that epoch).
3] Without BatchNormalization, the model starts overfitting after only 4 epochs.
4] With 5 hidden layers and dropout = 0.5, the number of epochs needed reached its maximum of 50.
5] All test accuracies fall in the range of 98-98.5%.
6] Models without BatchNormalization achieved a lower test accuracy of approximately 97%.
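The epoch at which overfitting sets in was presumably read off the training and validation curves. Continuing the sketch above, one way to capture that epoch automatically is Keras's EarlyStopping callback; the patience value and the 50-epoch cap below are assumptions, not settings from the notebook.

```python
# Continuation of the sketch above (model, x_train, y_train defined there).
# EarlyStopping halts training once validation loss stops improving, which
# marks the point where the model begins to overfit.
import numpy as np
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor="val_loss", patience=3,
                           restore_best_weights=True)

history = model.fit(x_train, y_train,
                    batch_size=128, epochs=50,
                    validation_split=0.1,
                    callbacks=[early_stop])

# Epoch (1-indexed) with the lowest validation loss, i.e. just before overfitting.
best_epoch = int(np.argmin(history.history["val_loss"])) + 1
print("Validation loss bottomed out at epoch", best_epoch)
```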