The code is built on the Theano library, so it can run on either the CPU or the GPU: set the GPU flag to True to run on the GPU, or to False to run on the CPU.
This program incorporates ideas and code from Michael Nielsen's book "Neural Networks and Deep Learning" and his accompanying GitHub repository.
MNISTLoadData.py: Unpacks the MNIST data package and returns the training, validation, and test data.
MNISTTraining_Theano.py: Implementation of a neural network using Theano, which allows the code to run on either the CPU or the GPU. This implementation also supports several cost functions and activation functions.
MNIST_CrossEntropy.py: Implementation of a neural network using the cross-entropy cost function, the sigmoid activation function, and stochastic gradient descent.
MNIST_QuadraticAndCrossEntropy.py: Implementation of a neural network using the quadratic cost function, the sigmoid activation function, and stochastic gradient descent.
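Since two of the modules above differ mainly in their cost function, a minimal pure-Python sketch of the two costs for a single sigmoid output neuron may help clarify the difference (the function names here are illustrative, not taken from the repository):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def quadratic_cost(a, y):
    # C = (a - y)^2 / 2
    return 0.5 * (a - y) ** 2

def cross_entropy_cost(a, y):
    # C = -[y ln(a) + (1 - y) ln(1 - a)]
    return -(y * math.log(a) + (1 - y) * math.log(1 - a))

# With the quadratic cost, the output-layer error is (a - y) * sigmoid'(z),
# so learning slows down when the neuron saturates; with the cross-entropy
# cost the error is simply (a - y), which avoids that slowdown.
a = sigmoid(2.0)  # a confidently wrong neuron when the target is y = 0
print(quadratic_cost(a, 0.0))
print(cross_entropy_cost(a, 0.0))
```

Both modules train with stochastic gradient descent; only this cost term (and therefore the gradient fed back into the weight updates) changes.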
import MNISTTraining_Theano
from MNISTTraining_Theano import Network
from MNISTTraining_Theano import ConvPoolLayer, FullyConnectedLayer, SoftmaxLayer
training_data, validation_data, test_data = MNISTTraining_Theano.load_data_shared()
mini_batch_size = 10
net = Network([
    ConvPoolLayer(image_shape=(mini_batch_size, 1, 28, 28),  # MNIST images are 28x28
                  filter_shape=(20, 1, 5, 5),
                  poolsize=(2, 2)),
    FullyConnectedLayer(n_in=20*12*12, n_out=100),  # 20 feature maps of 12x12 after pooling
    SoftmaxLayer(n_in=100, n_out=10)], mini_batch_size)
# Train for 60 epochs with learning rate 0.1
net.SGD(training_data, 60, mini_batch_size, 0.1, validation_data, test_data)
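Since MNIST images are 28x28 pixels, the number of inputs the fully connected layer must accept can be checked with simple arithmetic (plain Python, nothing repository-specific):

```python
# 28x28 MNIST input, 20 feature maps, 5x5 filters, 2x2 max-pooling
image_size = 28
filter_size = 5
pool_size = 2
n_feature_maps = 20

conv_out = image_size - filter_size + 1      # "valid" convolution: 24x24
pool_out = conv_out // pool_size             # after 2x2 pooling: 12x12
n_in = n_feature_maps * pool_out * pool_out  # inputs to the fully connected layer

print(conv_out, pool_out, n_in)
```

The same arithmetic applies if you change the filter or pooling sizes; the FullyConnectedLayer's n_in must always match the flattened output of the preceding ConvPoolLayer.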