Neural-Net-Scratch

A simple from-scratch implementation of classic neural network algorithms, without any external libraries, for classifying handwritten digit images.

Implements and tests the ReLU and tanh activation functions, along with simple gradient descent and batch gradient descent using both constant and decaying learning rates.
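
A minimal sketch of how these pieces might look in pure Python with no external libraries; the function names, the decay schedule eta_t = eta0 / (1 + decay * epoch), and the example numbers are illustrative assumptions, not code taken from this repository.

```python
import math

# Activation functions and their derivatives (pure Python, no external libraries).
def relu(x):
    return x if x > 0.0 else 0.0

def relu_prime(x):
    return 1.0 if x > 0.0 else 0.0

def tanh(x):
    return math.tanh(x)

def tanh_prime(x):
    t = math.tanh(x)
    return 1.0 - t * t

# One batch gradient descent update on a weight vector, with an optionally
# decaying learning rate: eta_t = eta0 / (1 + decay * epoch).
# Setting decay = 0.0 gives a constant learning rate.
def batch_gd_step(weights, gradients, eta0, decay, epoch):
    eta = eta0 / (1.0 + decay * epoch)
    return [w - eta * g for w, g in zip(weights, gradients)]

# Example usage with made-up weights and gradients.
w = [0.5, -0.3]
g = [0.1, -0.2]
for epoch in range(3):
    w = batch_gd_step(w, g, eta0=0.1, decay=0.01, epoch=epoch)
print(w)
```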