A deep neural network written in Python with NumPy and trained on the MNIST dataset: http://yann.lecun.com/exdb/mnist/.
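
The MNIST files at that URL are gzipped IDX binaries; the sketch below shows one way they could be read into NumPy arrays. The function names and file paths are illustrative only and are not part of this repository.

```python
import gzip
import numpy as np

def load_mnist_images(path):
    """Read a gzipped MNIST image file (idx3-ubyte) into an (n, 784) float array."""
    with gzip.open(path, "rb") as f:
        data = f.read()
    # Header after the magic number: image count, rows, cols (big-endian uint32).
    n, rows, cols = (int(v) for v in np.frombuffer(data, dtype=">u4", count=3, offset=4))
    pixels = np.frombuffer(data, dtype=np.uint8, offset=16)
    return pixels.reshape(n, rows * cols).astype(np.float32) / 255.0  # scale to [0, 1]

def load_mnist_labels(path):
    """Read a gzipped MNIST label file (idx1-ubyte) into an (n,) uint8 array."""
    with gzip.open(path, "rb") as f:
        data = f.read()
    return np.frombuffer(data, dtype=np.uint8, offset=8)

# Example usage, assuming the files from the URL above have been downloaded:
# X_train = load_mnist_images("train-images-idx3-ubyte.gz")
# y_train = load_mnist_labels("train-labels-idx1-ubyte.gz")
```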
- Architecture: 3 hidden layers with 500, 300, and 100 neurons, respectively
- Activation functions: ReLU for the hidden layers, softmax for the output layer (forward pass sketched below)
- Optimization: Adam optimization algorithm, mini-batch size = 128 (update step sketched below)
- Regularization: Inverted Dropout (keep_prob = 0.8, applied in the forward-pass sketch below)
- Features: Gradient checking (sketched below)
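
A minimal sketch of the forward pass through this architecture (784-500-300-100-10 for MNIST), with ReLU hidden layers, inverted dropout at keep_prob = 0.8, and a softmax output. The parameter layout and function names are assumptions for illustration, not the actual code in this repository.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    # Shift by the column-wise max for numerical stability.
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)

def forward_pass(X, params, keep_prob=0.8, train=True):
    """Forward pass: ReLU hidden layers with inverted dropout, softmax output."""
    A = X  # columns are examples: shape (784, batch_size)
    for l in range(1, 4):  # the three hidden layers
        Z = params["W%d" % l] @ A + params["b%d" % l]
        A = relu(Z)
        if train:
            # Inverted dropout: drop units, then rescale by keep_prob so the
            # expected activation matches what the network sees at test time.
            mask = np.random.rand(*A.shape) < keep_prob
            A = A * mask / keep_prob
    ZL = params["W4"] @ A + params["b4"]
    return softmax(ZL)  # shape (10, batch_size): class probabilities

# Illustrative usage with randomly initialized parameters:
layer_sizes = [784, 500, 300, 100, 10]
rng = np.random.default_rng(0)
params = {}
for l in range(1, 5):
    params["W%d" % l] = rng.standard_normal((layer_sizes[l], layer_sizes[l - 1])) * 0.01
    params["b%d" % l] = np.zeros((layer_sizes[l], 1))
probs = forward_pass(rng.standard_normal((784, 128)), params)
print(probs.shape)  # (10, 128)
```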
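
The Adam update for one mini-batch could look roughly like the following. The hyperparameter values (beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8) are the common defaults and are an assumption, not necessarily the values used in this project.

```python
import numpy as np

def adam_update(params, grads, m, v, t, learning_rate=0.001,
                beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam step; params, grads, m, v are dicts keyed 'W1', 'b1', ...

    m and v start as zero arrays shaped like the parameters; t counts
    update steps starting at 1 (needed for the bias correction).
    """
    for key in params:
        # Exponentially weighted first and second moment estimates.
        m[key] = beta1 * m[key] + (1 - beta1) * grads[key]
        v[key] = beta2 * v[key] + (1 - beta2) * grads[key] ** 2
        # Bias correction, then the parameter step.
        m_hat = m[key] / (1 - beta1 ** t)
        v_hat = v[key] / (1 - beta2 ** t)
        params[key] -= learning_rate * m_hat / (np.sqrt(v_hat) + epsilon)
    return params, m, v
```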
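
Gradient checking compares the analytic gradients from backpropagation against a numerical estimate. A generic sketch, assuming a hypothetical `cost(theta)` callable that maps a flat parameter vector to the scalar loss:

```python
import numpy as np

def gradient_check(cost, theta, analytic_grad, eps=1e-7):
    """Compare a backprop gradient against a central-difference estimate.

    cost: callable mapping a flat parameter vector to a scalar loss.
    theta: flat parameter vector, shape (n,).
    analytic_grad: gradient from backprop, shape (n,).
    """
    numeric_grad = np.zeros_like(theta)
    for i in range(theta.size):
        theta_plus = theta.copy()
        theta_minus = theta.copy()
        theta_plus[i] += eps
        theta_minus[i] -= eps
        # Central difference approximation of d(cost)/d(theta_i).
        numeric_grad[i] = (cost(theta_plus) - cost(theta_minus)) / (2 * eps)
    # Relative difference; values around 1e-7 or smaller suggest correct gradients.
    return np.linalg.norm(numeric_grad - analytic_grad) / (
        np.linalg.norm(numeric_grad) + np.linalg.norm(analytic_grad))
```

Note that dropout should be disabled (keep_prob = 1) while checking gradients, since the random dropout mask makes the loss non-deterministic.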
- Author: Matthias Wright
This project is licensed under the MIT License; see the LICENSE.md file for details.