Neural Network Labs
In neural networks, we train a model on some data and then test it on other data. This repository implements some very basic training and simulation methods that are commonly used at the very beginning of learning about neural networks:

- Simulation with Concurrent Inputs in a Static Network
- Simulation with Sequential Inputs in a Dynamic Network
- Simulation with Concurrent Inputs in a Dynamic Network
- Incremental Training of Static Networks
- Incremental Training of Dynamic Networks
- Batch Training of Static Networks
- Batch Training of Dynamic Networks
- Single-Neuron Incremental Training
- Single-Neuron Batch Training
- XOR with Multi-Layer Perceptrons
- Hebb's Learning Hypothesis
- Covariance Hypothesis
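The last two items in the list above can be sketched in a few lines of numpy. This is only an illustration of the two update rules, not the code from this repository; all variable names and values are made up for the example. Hebb's hypothesis strengthens a weight in proportion to the product of pre- and postsynaptic activity, while the covariance hypothesis correlates deviations from mean activity, so weights can also decrease:

```python
import numpy as np

# Learning rate (illustrative value).
eta = 0.1

# Activity samples for a presynaptic unit x and a postsynaptic unit y.
x = np.array([1.0, 0.0, 1.0, 1.0, 0.0])
y = np.array([1.0, 0.0, 0.0, 1.0, 1.0])

# Hebb's rule, accumulated over all samples: dw = eta * x * y.
w_hebb = eta * np.sum(x * y)

# Covariance rule: dw = eta * (x - <x>) * (y - <y>).
w_cov = eta * np.sum((x - x.mean()) * (y - y.mean()))

print(w_hebb)  # always non-negative for non-negative activities
print(w_cov)   # can be negative when activities are anti-correlated
```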
Installation
Use the package manager pip to install numpy.
```shell
pip install numpy
```
Usage
All inputs and outputs follow the "Neural Network Toolbox™ User's Guide" by Mark Hudson Beale, Martin T. Hagan, and Howard B. Demuth.
For the XOR example, follow the inputs given in the comments of the code.
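For context, the XOR problem mentioned above can be sketched as a small multi-layer perceptron trained by batch gradient descent. This is a hedged illustration, not the repository's script: the 2-2-1 layout, sigmoid units, learning rate, seed, and iteration count are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)  # input -> hidden
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)  # hidden -> output
eta = 1.0
losses = []

for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((y - T) ** 2)))
    # Backward pass: squared-error loss, sigmoid derivative s * (1 - s).
    dy = (y - T) * y * (1 - y)
    dh = (dy @ W2.T) * h * (1 - h)
    W2 -= eta * h.T @ dy; b2 -= eta * dy.sum(axis=0)
    W1 -= eta * X.T @ dh; b1 -= eta * dh.sum(axis=0)

print(losses[0], "->", losses[-1])    # training error should shrink
print((y > 0.5).astype(int).ravel())  # predicted classes after training
```

Because XOR is not linearly separable, the hidden layer is essential; a single neuron cannot solve it.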
Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.