
Python Neural Network

A simple deep neural network I made in Python with NumPy on my quest to understand machine learning and neural networks. Version 1.3.

Files

  • ai.py - testing the neural network
  • perceptron.py - the actual neural network
  • activations.py - different activation functions for the network

Installation

The only non-native module this program uses is NumPy, so to use this neural network class all you have to do is make sure Python is installed, install NumPy with pip (pip install numpy), then download the files and run them with your preferred method of Python execution.

The Network

The network is a simple deep neural network with a configurable number of input nodes, hidden layers, hidden nodes in those layers, output nodes, and training epochs (mini-batch training is coming soon; currently the batch size is 1). The network is object oriented, so you can bring it into your program by importing the file NeuralNetwork.py and creating a variable for a neural network. The constructor takes 5 arguments: the number of inputs, layers, hiddens, outputs, and epochs, in that order.
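For example, here is a minimal sketch of creating a network for XOR (2 inputs, 1 hidden layer with 4 nodes per layer, 1 output, 10000 epochs). The file and class names are assumptions, so adjust the import to match your copy of the code:

```python
# Sketch assuming the network class is called NeuralNetwork and lives in perceptron.py
# (the text above mentions NeuralNetwork.py, so adjust the import to your copy of the code).
from perceptron import NeuralNetwork

# inputs, layers, hiddens, outputs, epochs
nn = NeuralNetwork(2, 1, 4, 1, 10000)
```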

Functions

learning rate

  • setLearningRate - sets the network's learning rate.
  • getLearningRate - returns the network's learning rate.
  • dec_learningRate - reduces the network's learning rate by a given amount; the default is 0.00001.

epochs

  • setEpochs - sets the network's training epochs.
  • getEpochs - returns the network's training epochs.
  • inc_Epochs - increases the network's epochs by a given amount; the default is 1000 (see the usage sketch below).
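A quick usage sketch of the learning-rate and epoch helpers, continuing with the nn object from the construction example above; the exact behaviour of each method is assumed from the descriptions in these lists:

```python
nn.setLearningRate(0.1)        # set the learning rate
print(nn.getLearningRate())    # 0.1
nn.dec_learningRate(0.00001)   # lower the learning rate by a given amount

nn.setEpochs(5000)             # set the number of training epochs
print(nn.getEpochs())          # 5000
nn.inc_Epochs(1000)            # add more training epochs
```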

training

  • fit - runs the train function on the training data self.epochs times.
  • train - uses backpropagation and stochastic gradient descent to train the network.

testing

  • test - takes in testing data and testing labels, runs the data through the network, compares the results to the labels, and gives you the network's accuracy on the test data.

getting prediction

  • process - uses feedforward to guess the output of an input.
  • process_all - takes in a collection of unknown data and outputs the guess for each.

Training

Training the network involves creating a for loop over the network's epochs, inside the loop building a list with a random entry from your training set and another list with the corresponding label, and calling the .train function with that data and label. Alternatively, simply call .fit on your training data.
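Here is a minimal sketch of both approaches, using XOR as the training set. The .train(data, label) and .fit(data, labels) signatures, as well as the file and class names, are assumptions based on the descriptions above:

```python
import random

from perceptron import NeuralNetwork  # assumed file and class names

training_data = [[0, 0], [0, 1], [1, 0], [1, 1]]
training_labels = [[0], [1], [1], [0]]

nn = NeuralNetwork(2, 1, 4, 1, 10000)

# Option 1: the manual loop described above (one random sample per epoch, batch size 1)
for _ in range(nn.getEpochs()):
    i = random.randrange(len(training_data))
    nn.train(training_data[i], training_labels[i])

# Option 2: let the network run the loop itself
nn.fit(training_data, training_labels)
```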


Testing/Predicting

To test the network, simply call .test on your test data and test labels. To use it for predictions, either call the .process function on a single piece of unknown data or .process_all on a collection of unknowns.
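A short sketch, continuing with the trained nn from the training example above; the data here is a placeholder in the same format as the training set, and the return values are assumptions based on the descriptions above:

```python
test_data = [[0, 1], [1, 1]]       # placeholder test set (same format as the training data)
test_labels = [[1], [0]]

accuracy = nn.test(test_data, test_labels)   # accuracy of the network on the test set
print(accuracy)

print(nn.process([1, 0]))                    # prediction for one piece of unknown data
print(nn.process_all([[0, 0], [1, 1]]))      # predictions for a collection of unknowns
```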

The current test program

Currently the network is being tested on XOR. It prints output for every epoch of the training cycle and, when training is finished, displays the network's guesses. (Earlier versions also graphed the error; the program no longer does this.) You can see the network run on MNIST data here.

To Do List

  • fix mse and rmse - currently they are my best guess as to how those functions work, but they could very well be wrong (see #2)
  • add functionality to save a model
  • add other loss functions like cross entropy
  • refactor the code
  • add more documentation with comments inside the code
  • add functionality for easy switching between different activation functions
  • add more activation functions
  • add functionality for mini batch training
  • run this network on MNIST or a similar dataset here

Credits and links to learn more