- The Kaggle MNIST training data was downloaded from https://raw.githubusercontent.com/wehrley/Kaggle-Digit-Recognizer/master/train.csv
- Link to training data on Kaggle: https://www.kaggle.com/c/digit-recognizer/data
- RunBuilder and RunManager classes in files below adapted from DeepLizard pytorch tutorial: https://deeplizard.com/learn/playlist/PLZbbT5o_s2xrfNyHZsM6ufI0iZENK9xgG
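The RunBuilder class from the DeepLizard tutorial turns a dict of hyperparameter options into a list of named runs. A minimal sketch of that idea (the exact class in the notebooks may differ; the parameter names below are just example values):

```python
from collections import namedtuple
from itertools import product

class RunBuilder:
    """Build one named run per hyperparameter combination."""
    @staticmethod
    def get_runs(params):
        # params: dict mapping parameter name -> list of values to try
        Run = namedtuple('Run', params.keys())
        return [Run(*values) for values in product(*params.values())]

# Example values, not the notebooks' actual hyperparameters:
params = dict(lr=[0.01, 0.001], batch_size=[64, 128])
runs = RunBuilder.get_runs(params)
# runs[0] is e.g. Run(lr=0.01, batch_size=64); len(runs) == 4
```

Each run can then be iterated over in a training loop, with RunManager handling logging per run.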
kaggle_mnist_feed_forward_no_hidden_no_cnn_sigmoid_2020_07_11.ipynb
- Fully connected feed forward neural network with no hidden layers and sigmoid activation that achieves 91.9% validation accuracy.
- Warning: in-notebook experimentation with torch and reshaping of the data may make this file harder to follow.
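A no-hidden-layer sigmoid network like the one this notebook describes reduces to a single 784-to-10 linear map followed by a sigmoid. A hedged sketch in PyTorch (not the notebook's actual code; layer construction and batch size are illustrative):

```python
import torch
import torch.nn as nn

# Sketch: flatten 28x28 MNIST images, then a single linear layer
# (no hidden layers) with sigmoid activation over the 10 digit classes.
model = nn.Sequential(
    nn.Flatten(),            # (N, 1, 28, 28) -> (N, 784)
    nn.Linear(28 * 28, 10),  # one weight matrix, no hidden layer
    nn.Sigmoid(),            # per-class sigmoid output in (0, 1)
)

x = torch.randn(32, 1, 28, 28)  # fake batch of MNIST-shaped inputs
out = model(x)                  # shape (32, 10)
```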
kaggle_mnist_feed_forward_one_hidden_no_cnn_2020_07_13.ipynb
- Fully connected feed forward neural network with one hidden layer and sigmoid activation that achieves 97.4% validation accuracy.
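Adding one hidden layer is a small change to the same setup. A sketch under assumed sizes (the hidden width of 100 is a guess, not the notebook's actual value):

```python
import torch
import torch.nn as nn

# Sketch: one sigmoid hidden layer between the 784 inputs and 10 outputs.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 100),  # hidden layer; width 100 is an assumption
    nn.Sigmoid(),
    nn.Linear(100, 10),       # logits for the 10 digit classes
)

out = model(torch.randn(16, 1, 28, 28))  # shape (16, 10)
```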
kaggle_mnist_fully_connected_no_hidden_tanh_v2.ipynb
- Fully connected feed forward neural network with no hidden layers and tanh activation that achieves 91.9% validation accuracy.
- CNN with one hidden layer that achieves 98.3% validation accuracy in about 25 epochs.
- CNN whose best run achieves 99.57% validation accuracy on MNIST and 99.528% test accuracy on the Kaggle submission.
- The best run is at the very end of the notebook.
- Some other experiments are included.
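For reference, a small MNIST CNN of the kind these bullets describe can be sketched as below. This is an assumption-laden illustration, not the notebooks' architecture: the channel counts, kernel sizes, and fully connected widths are all guesses.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Illustrative MNIST CNN; all layer sizes here are assumptions."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 8, kernel_size=5)   # 28x28 -> 24x24
        self.conv2 = nn.Conv2d(8, 16, kernel_size=5)  # 12x12 -> 8x8
        self.fc1 = nn.Linear(16 * 4 * 4, 64)
        self.out = nn.Linear(64, 10)

    def forward(self, x):
        x = torch.relu(self.conv1(x))
        x = torch.max_pool2d(x, 2)      # 24x24 -> 12x12
        x = torch.relu(self.conv2(x))
        x = torch.max_pool2d(x, 2)      # 8x8 -> 4x4
        x = x.flatten(start_dim=1)      # (N, 16*4*4)
        x = torch.relu(self.fc1(x))
        return self.out(x)              # logits over 10 classes

logits = SmallCNN()(torch.randn(4, 1, 28, 28))  # shape (4, 10)
```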