Bottle_Neck_Effect_using_MNIST

A small experiment with convolutional neural networks in Keras.


This Jupyter notebook compares convolutional neural network architectures, ranging from the simplest model with only a single dense layer to a network with two convolutional layers and max pooling. We observe how these architectures affect train and test accuracy on the well-known MNIST dataset.
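The notebook's exact code is not reproduced here, but a minimal Keras sketch of the two extremes might look like the following; the filter counts (32 and 64) and kernel sizes are illustrative assumptions, not the notebook's actual settings.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Baseline: flatten the 28x28 image and classify with a single dense layer.
baseline = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

# Two convolutional layers with max pooling, then the classifier head.
# Filter counts (32, 64) are assumptions chosen for illustration.
conv_net = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

for model in (baseline, conv_net):
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
```

Both models can then be trained on MNIST with `model.fit` and compared on the held-out test set with `model.evaluate`.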

It also demonstrates the so-called bottleneck effect, which appears when the second convolutional layer has fewer filters than the first. In this case backpropagation shuts off many filters in the first layer, because the second layer acts as a bottleneck. This is also a major reason why commonly used architectures such as VGG and MobileNet increase the number of filters in the convolutional layers as one moves deeper into the network; this notebook tries to find a reason for that choice. In Simonyan et al. (2014), the authors comment that the reason is to prevent the loss of information.
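As a concrete illustration, a bottleneck variant might shrink the second convolutional layer to 8 filters while keeping 32 in the first; these counts are assumptions for the sketch, not the notebook's actual values.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Bottleneck variant: the second conv layer has fewer filters (8) than
# the first (32), so it constricts the information flowing forward, and
# the gradients flowing back can drive many first-layer filters toward
# inactivity. Filter counts are illustrative assumptions.
bottleneck_net = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(8, (3, 3), activation="relu"),  # fewer filters than layer 1
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
```

One way to observe the effect after training is to build an intermediate model that outputs the first conv layer's activations and count how many filters have near-zero mean activation across a batch of inputs; filters the bottleneck has shut off show up as nearly constant zeros.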