
MLP - My Little Pony or Multi-Layer-Perceptron?

A multi-layer perceptron network with back-propagation, written from scratch in Python 3, with statistical explorations of differing activation and cost functions. Hyper-parameter tuning techniques, such as learning-rate decay, were also implemented from scratch.
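The repo's actual class and function names aren't reproduced here, but a minimal sketch of the core ideas (a hypothetical `MLP` class with one hidden layer, sigmoid activation, a mean-squared-error cost, and exponential learning-rate decay, which are assumptions rather than the notebook's exact choices) might look like:

```python
import numpy as np

# Hypothetical sketch -- not the repo's actual API.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

class MLP:
    """Tiny multi-layer perceptron: one hidden layer, sigmoid
    activation, MSE cost, gradient descent via back-propagation."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        # Cache pre-activations and activations for back-propagation.
        self.z1 = X @ self.W1 + self.b1
        self.a1 = sigmoid(self.z1)
        self.z2 = self.a1 @ self.W2 + self.b2
        self.a2 = sigmoid(self.z2)
        return self.a2

    def backward(self, X, y, lr):
        m = X.shape[0]
        # MSE cost C = (1/2m) * sum((a2 - y)^2); deltas follow the chain rule.
        delta2 = (self.a2 - y) * sigmoid_prime(self.z2)
        delta1 = (delta2 @ self.W2.T) * sigmoid_prime(self.z1)
        # Gradient-descent updates, averaged over the batch.
        self.W2 -= lr * (self.a1.T @ delta2) / m
        self.b2 -= lr * delta2.mean(axis=0)
        self.W1 -= lr * (X.T @ delta1) / m
        self.b1 -= lr * delta1.mean(axis=0)

    def train(self, X, y, epochs=5000, lr=1.0, decay=0.0):
        for epoch in range(epochs):
            self.forward(X)
            # Exponential learning-rate decay (one common scheme).
            self.backward(X, y, lr * np.exp(-decay * epoch))
```

Swapping in a different activation or cost function amounts to replacing `sigmoid`/`sigmoid_prime` and the `(self.a2 - y)` cost derivative with the corresponding pair.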

This home-made network was first tested on the classic XOR-gate problem, then on a sine curve, and finally on the letter-recognition data also included in this repo.
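For instance, training the hypothetical sketch above on the XOR truth table might look like this (convergence depends on the random initialisation and may need some hyper-parameter tuning):

```python
# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

net = MLP(n_in=2, n_hidden=4, n_out=1)
net.train(X, y, epochs=10000, lr=2.0, decay=1e-4)
print(net.forward(X).round(2))  # should approach [[0], [1], [1], [0]]
```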

The notebook is quite long, so an HTML version has also been provided for quicker loading.

Written in Python 3.8.5