/neural-network

🌐 Perceptron-based neural network using error-weighted derivative and Sigmoid normalization

Primary LanguagePython

Problem set

Below are four training examples, each consisting of three binary inputs and an expected output. Based on these, we want to write a neural network that can predict the output for the 'New situation' input.

Previously known inputs and outputs

|           | Input | Output |
|-----------|-------|--------|
| Example 1 | 0 0 1 | 0      |
| Example 2 | 1 1 1 | 1      |
| Example 3 | 1 0 1 | 1      |
| Example 4 | 0 1 1 | 0      |

New situation

|           | Input | Output |
|-----------|-------|--------|
| New input | 1 0 0 | ?      |

Perceptron neural network

This network won't have any hidden layers: the three inputs feed directly into a single output neuron.

In our case the input values will be either 0 or 1, and each synapse is given a random weight. The neuron computes a weighted sum of the inputs, which we then pass through a normalizing function to squash the result towards 0 or 1. For this, we'll use the Sigmoid function.
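As a minimal sketch, the Sigmoid function maps any weighted sum to a value in the open interval (0, 1):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real number into the range (0, 1):
    # large negative inputs approach 0, large positive inputs approach 1
    return 1 / (1 + np.exp(-x))
```

For example, `sigmoid(0)` is exactly 0.5, while a strongly positive weighted sum like `sigmoid(10)` is very close to 1.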

Training process

  1. Take the inputs from a training example and put them through the formula to get the neuron's output
  2. Calculate the error, which is the difference between the expected output and the output we got
  3. Adjust the weights according to the severity of the error
  4. Repeat the process 100,000 times
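The steps above can be sketched as follows, using NumPy and the training data from the problem set. The random seed and variable names are illustrative choices, not part of the original description:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(y):
    # Gradient of the sigmoid, expressed in terms of its output y
    return y * (1 - y)

# Training data from the problem set above
training_inputs = np.array([[0, 0, 1],
                            [1, 1, 1],
                            [1, 0, 1],
                            [0, 1, 1]])
training_outputs = np.array([[0, 1, 1, 0]]).T

rng = np.random.default_rng(1)
weights = 2 * rng.random((3, 1)) - 1  # random weights in (-1, 1)

for _ in range(100_000):
    output = sigmoid(training_inputs @ weights)   # step 1: forward pass
    error = training_outputs - output             # step 2: error
    # step 3: error-weighted derivative adjustment
    weights += training_inputs.T @ (error * sigmoid_derivative(output))

# New situation: the trained network's prediction for input 1 0 0
prediction = sigmoid(np.array([1, 0, 0]) @ weights)
print(prediction)
```

After training, the prediction for the new input `1 0 0` should be very close to 1, matching the pattern in the examples where the output follows the first input.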

Error Weighted Derivative

We'll multiply the error, which is the difference between the expected output and the actual output, by the input - which is either a 0 or a 1 - and by the gradient of the Sigmoid function at the neuron's output. Because the Sigmoid gradient is largest near 0.5 and smallest near 0 or 1, this adjusts the weights heavily when the neuron is unsure and only slightly when it is already confident.
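A small worked example of this adjustment rule for a single weight, using hypothetical values (input 1, expected output 1, current output 0.6):

```python
def sigmoid_derivative(y):
    # Slope of the sigmoid at output y; near 0 when y is close to 0 or 1,
    # so confident outputs receive only small adjustments
    return y * (1 - y)

input_value = 1.0
expected = 1.0
output = 0.6

error = expected - output                                   # 0.4
adjustment = error * input_value * sigmoid_derivative(output)
print(adjustment)  # 0.4 * 1.0 * 0.24, i.e. roughly 0.096
```

Note that when the input is 0, the adjustment is 0 as well: a weight is only changed when its input actually contributed to the output.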