MNIST 0~4: an easy deep neural network implemented in TensorFlow
Needed packages
- tensorflow
- numpy
MNIST 0~4
- This repository is a practice project for TensorFlow
- The model is trained on MNIST, but only on the digits 0 to 4
- Softmax output layer with five neurons (one per digit)
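Restricting MNIST to the digits 0 to 4 amounts to a boolean mask over the labels. A minimal NumPy sketch (the function name and the toy arrays below are illustrative; the repository's actual loading code may differ):

```python
import numpy as np

def filter_digits(images, labels, max_digit=4):
    # Keep only the samples whose label is <= max_digit (here: digits 0~4).
    mask = labels <= max_digit
    return images[mask], labels[mask]

# Toy stand-in for the MNIST arrays: 6 flattened images, labels 0..5
images = np.arange(6 * 784, dtype=np.float32).reshape(6, 784)
labels = np.array([0, 5, 3, 1, 4, 2])

filtered_images, filtered_labels = filter_digits(images, labels)
```

After filtering, only the five samples with labels in {0, 1, 2, 3, 4} remain.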
Model architecture & implementation details
- Only 5 hidden layers of 128 neurons each
- The ELU activation function is used in each hidden layer
- Adam optimizer
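The forward pass described above (five 128-unit ELU layers feeding a 5-way softmax) can be sketched in plain NumPy. The layer sizes come from the list above; the weight initialization and variable names are assumptions for illustration, not the repository's actual code:

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (exp(x) - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    # Numerically stabilized softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
sizes = [784] + [128] * 5 + [5]  # input, 5 hidden layers, 5-class output
weights = [rng.normal(0.0, 0.05, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = elu(h @ W + b)          # hidden layers use ELU
    return softmax(h @ weights[-1] + biases[-1])  # softmax output

probs = forward(rng.normal(size=(2, 784)))  # two dummy flattened images
```

In the actual project, training (cross-entropy loss minimized with the Adam optimizer) is handled by TensorFlow rather than written by hand.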
Sub-functions
- We wrote three helper functions for evaluation:
- accuracy function
- mAP function
- recall function
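The three evaluation helpers might look like the NumPy sketch below. Note an assumption: the README's "mAP" is read here as macro-averaged precision over the five classes (a common reading for plain classification); the repository's definition may differ:

```python
import numpy as np

def accuracy(y_true, y_pred):
    # Fraction of predictions that match the true label
    return float(np.mean(y_true == y_pred))

def macro_precision(y_true, y_pred, n_classes=5):
    # Per-class precision TP / (TP + FP), averaged over classes
    vals = []
    for c in range(n_classes):
        pred_c = y_pred == c
        vals.append(float(np.mean(y_true[pred_c] == c)) if pred_c.any() else 0.0)
    return float(np.mean(vals))

def macro_recall(y_true, y_pred, n_classes=5):
    # Per-class recall TP / (TP + FN), averaged over classes
    vals = []
    for c in range(n_classes):
        true_c = y_true == c
        vals.append(float(np.mean(y_pred[true_c] == c)) if true_c.any() else 0.0)
    return float(np.mean(vals))

# Toy example: one class-4 sample is misclassified as class 0
y_true = np.array([0, 1, 2, 3, 4, 0])
y_pred = np.array([0, 1, 2, 3, 0, 0])
acc = accuracy(y_true, y_pred)  # 5 of 6 correct
```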