
Neural-Network-Learning-Rules

The basic learning rules in Neural Networks, with simple example code showing how they work.

  1. Hebbian Learning Rule: the weight between two nodes is strengthened in proportion to the product of their activities (input activity times output activity).
  2. Perceptron Learning Rule: the network starts with random weights and then adjusts each weight in proportion to the error between the desired output and the thresholded output.
  3. Delta Learning Rule: the change in a synaptic weight equals the learning rate times the error times the input.
  4. Correlation Learning Rule: a supervised variant of the Hebbian rule; weights are updated using the product of the input and the desired (target) output rather than the actual output.
  5. Out-Star Learning Rule: used when the neurons of a network are arranged in a layer; the weights fanning out from a node are driven towards the desired outputs of the neurons they connect to.
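
The list above describes the update steps in words. As a rough, self-contained sketch (not code from this repository), the loop below trains a single neuron on logical AND with the perceptron rule and notes in comments how the Hebbian and delta updates would differ; the learning rate, toy data, and step activation are illustrative choices.

```python
import numpy as np

# Toy task: learn logical AND with a single neuron (bias folded into the weights).
X = np.array([[0., 0., 1.],
              [0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 1.]])
t = np.array([0., 0., 0., 1.])   # desired outputs

lr = 0.1          # learning rate
w = np.zeros(3)   # one weight per input (last input is the constant bias)

def step(a):
    """Hard-threshold activation: the neuron fires when its net input is non-negative."""
    return 1.0 if a >= 0.0 else 0.0

for epoch in range(100):
    for x, target in zip(X, t):
        net = np.dot(w, x)             # net input of the neuron
        error = target - step(net)     # perceptron rule: error of the thresholded output
        w += lr * error * x            # weight change = learning rate * error * input
        # Hebbian rule would use:  w += lr * net * x                (input * output activity)
        # Delta rule would use:    w += lr * (target - net) * x     (error on the raw net input)

# After training, the neuron should reproduce the AND truth table.
print("weights:", w)
print("outputs:", [step(np.dot(w, x)) for x in X])
```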

Making Neural Networks

ANNs (Keras)
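
The repository's own example is not reproduced here, but a minimal fully connected (dense) Keras model of the kind this heading refers to might look like the sketch below; the MNIST dataset, layer sizes, and training settings are assumptions for illustration only.

```python
# A minimal dense (fully connected) network in Keras, trained on MNIST digits.
from tensorflow import keras
from tensorflow.keras import layers

# Load and flatten the 28x28 images into 784-dimensional vectors in [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = keras.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),   # one output per digit class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)
print("test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```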

CNNs (Keras)
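
Likewise, a minimal convolutional Keras model might look like the following sketch; the dataset, filter counts, and training settings are again illustrative assumptions rather than the repository's exact code.

```python
# A minimal convolutional network in Keras for the same MNIST digits.
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
# Conv layers expect a channel dimension: (28, 28) -> (28, 28, 1).
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1)
print("test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```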