- Hebbian Learning Rule: Specifies how to modify the weight of a connection: when the two neurons it joins activate together, the weight between them is strengthened (the update is proportional to the product of input and output).
- Perceptron Learning Rule: The network starts learning with a random value assigned to each weight, then adjusts the weights whenever the predicted output is wrong.
- Delta Learning Rule: The modification of a node's synaptic weight is proportional to the product of the error (target minus actual output) and the input.
- Correlation Learning Rule: A supervised rule, similar to the Hebbian rule, except that the weight update uses the desired (target) output rather than the actual output.
- Out-Star Learning Rule: Applicable when the nodes or neurons of a network are arranged in a layer; the weights fanning out from a node are updated toward the desired outputs of that layer.
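As a minimal sketch of how two of the rules above work, the snippet below applies the Hebbian and Delta updates to a single neuron. All names (`eta`, `w`, `x`, `t`, `y`) and values are illustrative assumptions, not code from this repository:

```python
def hebbian_update(w, x, y, eta=0.1):
    """Hebbian rule: delta_w_i = eta * x_i * y (strengthen co-active connections)."""
    return [w_i + eta * x_i * y for w_i, x_i in zip(w, x)]

def delta_update(w, x, t, y, eta=0.1):
    """Delta rule: delta_w_i = eta * (t - y) * x_i (error times input)."""
    return [w_i + eta * (t - y) * x_i for w_i, x_i in zip(w, x)]

w = [0.0, 0.0]   # initial weights
x = [1.0, 0.5]   # input pattern

# Hebbian: with output y = 1, the weights move toward the input pattern.
w_hebb = hebbian_update(w, x, y=1.0)        # -> [0.1, 0.05]

# Delta: with target t = 1 and current output y = 0, the error (t - y)
# scales the same input-driven update.
w_delta = delta_update(w, x, t=1.0, y=0.0)  # -> [0.1, 0.05]
```

Note that the two rules coincide here only because the Hebbian output (y = 1) happens to equal the Delta rule's error (t - y = 1); in general, the Delta update shrinks to zero as the output approaches the target, while the Hebbian update does not.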
Repository: Aniket-Mishra/Neural-Networks — the basic learning rules in Neural Networks and simple example code of how they work. Written in Python.