I implement a perceptron and a multi-layer perceptron (MLP) from scratch (using only NumPy) and test their performance on binary classification tasks.
First, I build a single perceptron with trainable weights and a bias and test it on a binary classification task: distinguishing the digits 0 and 1 of the MNIST dataset (see the sketch after the list below).
- Explore the effect of the activation function on learning
- Explore the effect of the learning rate on the speed of learning
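As a rough illustration, here is a minimal sketch of such a perceptron in NumPy. The class name and the `lr` and `activation` parameters are illustrative assumptions, not the project's exact API; the update shown is the classic perceptron rule for the step activation, and it coincides with the logistic-regression SGD step when the activation is a sigmoid.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Perceptron:
    """A single neuron with trainable weights and bias (illustrative sketch)."""

    def __init__(self, n_inputs, lr=0.01, activation="step"):
        rng = np.random.default_rng(0)
        self.w = rng.normal(scale=0.01, size=n_inputs)  # small random weights
        self.b = 0.0
        self.lr = lr
        self.activation = activation

    def _act(self, z):
        if self.activation == "sigmoid":
            return sigmoid(z)
        return np.where(z > 0, 1.0, 0.0)  # step activation

    def predict(self, X):
        return self._act(X @ self.w + self.b)

    def fit(self, X, y, epochs=10):
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                err = yi - self._act(xi @ self.w + self.b)
                # Perceptron rule for the step activation; the identical
                # update is the cross-entropy gradient step when the
                # activation is a sigmoid.
                self.w += self.lr * err * xi
                self.b += self.lr * err
```

With MNIST images flattened to 784-dimensional vectors, something like `Perceptron(784, lr=0.01).fit(X_train, y_train)` would train the 0-vs-1 classifier, and sweeping `lr` would reproduce the learning-rate experiment.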
Then, I extend the model to an MLP and explore how performance is affected by varying the number of neurons in the hidden layer, again testing against the binary classification task (see the sketch after this list).
- Vary the number of hidden neurons and visualize its effect on performance
- Explore the effect of different activation functions on learning
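Below is a minimal sketch of such an MLP, assuming a single sigmoid hidden layer trained with batch gradient descent on a mean-squared-error loss; the names `MLP`, `n_hidden`, and `lr` are hypothetical, not the project's actual API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MLP:
    """One-hidden-layer MLP trained with plain batch backpropagation (sketch)."""

    def __init__(self, n_inputs, n_hidden, lr=0.1):
        rng = np.random.default_rng(0)
        self.W1 = rng.normal(scale=0.01, size=(n_inputs, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.01, size=(n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)         # hidden activations
        self.out = sigmoid(self.h @ self.W2 + self.b2)  # output probability
        return self.out

    def fit(self, X, y, epochs=100):
        y = y.reshape(-1, 1)
        n = len(X)
        for _ in range(epochs):
            out = self.forward(X)
            # Backprop through the sigmoid output (MSE loss) ...
            d_out = (out - y) * out * (1.0 - out)
            # ... and through the sigmoid hidden layer.
            d_h = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
            self.W2 -= self.lr * self.h.T @ d_out / n
            self.b2 -= self.lr * d_out.mean(axis=0)
            self.W1 -= self.lr * X.T @ d_h / n
            self.b1 -= self.lr * d_h.mean(axis=0)
```

Sweeping `n_hidden` over a range of values (say 2, 4, 8, 16) and plotting test accuracy for each would be one way to reproduce the hidden-layer-size experiment.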