
Neural Network Implementation

Description

A neural network implementation that lets you choose among several activation functions, optimizers, and loss functions.

Activation Functions

  • Sigmoid
  • ReLU
  • Leaky ReLU
  • Softmax

Optimizers

  • Gradient Descent
  • AdaGrad
  • RMSProp
  • Adam

Loss Functions

  • Cross-Entropy Loss
  • Hinge Loss
  • Mean Squared Error (MSE)
