
MLP from scratch

I implement a perceptron and a multi-layer perceptron (MLP) from scratch, using only NumPy, and evaluate their performance on binary classification tasks.

A single perceptron on a binary task

I implement a single perceptron with trainable weights and bias and test it on a binary classification task using the digits 0 and 1 of the MNIST dataset (a minimal sketch follows the list below).

  • Explored the effect of the activation function on learning
  • Explored the effect of the learning rate on the speed of learning
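
For reference, here is a minimal NumPy sketch of the kind of perceptron described above: a single neuron with trainable weights and bias, trained by gradient descent on a binary cross-entropy loss. The class name, the sigmoid activation, and the update rule are illustrative assumptions rather than the exact code in the notebook.

```python
import numpy as np

def sigmoid(z):
    # Squash pre-activations into (0, 1) for binary classification.
    return 1.0 / (1.0 + np.exp(-z))

class Perceptron:
    """Single neuron with trainable weights and bias (illustrative sketch)."""

    def __init__(self, n_inputs, lr=0.1, rng=None):
        rng = np.random.default_rng(rng)
        self.w = rng.normal(0.0, 0.01, size=n_inputs)  # small random weights
        self.b = 0.0
        self.lr = lr  # learning rate controls the speed of learning

    def forward(self, X):
        # X has shape (n_samples, n_inputs); returns one probability per sample.
        return sigmoid(X @ self.w + self.b)

    def train(self, X, y, epochs=20):
        for _ in range(epochs):
            p = self.forward(X)
            err = p - y  # gradient of binary cross-entropy w.r.t. the pre-activation
            self.w -= self.lr * (X.T @ err) / len(y)
            self.b -= self.lr * err.mean()

    def predict(self, X):
        return (self.forward(X) >= 0.5).astype(int)
```

With the MNIST digits 0 and 1 flattened to vectors and scaled to [0, 1], something like `Perceptron(n_inputs=784).train(X, y)` would be a typical usage; `lr` is the knob whose effect on the speed of learning is explored.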

An MLP on a binary task

I implement an MLP with a single hidden layer and test it on the same binary classification task, exploring how performance is affected by the number of neurons in the hidden layer (see the sketch after this list).

  • Varied the number of neurons in the hidden layer and visualized its effect on performance
  • Explored the effect of different activation functions on learning
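
The sketch below shows one way such an MLP could look in NumPy: a single hidden layer whose size is the parameter varied in the experiments. The ReLU hidden activation, the class interface, and the manual backpropagation of a binary cross-entropy loss are assumptions for illustration, not a copy of the notebook code.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def relu_grad(z):
    return (z > 0).astype(z.dtype)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MLP:
    """One hidden layer; n_hidden is the knob varied in the experiments (illustrative sketch)."""

    def __init__(self, n_inputs, n_hidden, lr=0.1, rng=None):
        rng = np.random.default_rng(rng)
        self.W1 = rng.normal(0.0, np.sqrt(2.0 / n_inputs), size=(n_inputs, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, np.sqrt(2.0 / n_hidden), size=n_hidden)
        self.b2 = 0.0
        self.lr = lr

    def forward(self, X):
        self.z1 = X @ self.W1 + self.b1   # hidden pre-activations
        self.h = relu(self.z1)            # hidden activations
        return sigmoid(self.h @ self.W2 + self.b2)

    def train(self, X, y, epochs=50):
        n = len(y)
        for _ in range(epochs):
            p = self.forward(X)
            # Backpropagate the binary cross-entropy loss through both layers.
            d_out = (p - y) / n                                  # shape (n,)
            dW2 = self.h.T @ d_out
            db2 = d_out.sum()
            d_hidden = np.outer(d_out, self.W2) * relu_grad(self.z1)
            dW1 = X.T @ d_hidden
            db1 = d_hidden.sum(axis=0)
            self.W2 -= self.lr * dW2
            self.b2 -= self.lr * db2
            self.W1 -= self.lr * dW1
            self.b1 -= self.lr * db1

    def predict(self, X):
        return (self.forward(X) >= 0.5).astype(int)
```

Swapping `relu` for another activation (e.g. sigmoid or tanh, with the matching gradient) and changing `n_hidden` are the two variations whose effect on learning the notebook examines.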