Neural Network implemented with different Activation Functions, Optimizers, and Loss Functions.
Activation Functions (sketched in NumPy below):
- Sigmoid
- ReLU
- Leaky ReLU
- Softmax
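These are the standard definitions of the four activations; a minimal NumPy sketch, not necessarily the repository's exact implementation:

```python
import numpy as np

def sigmoid(x):
    # Logistic function: squashes inputs to (0, 1) (straightforward form)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit: max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but passes a small slope (alpha) for negative inputs;
    # alpha=0.01 is a common default, not a value taken from this repo
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Shift by the row max for numerical stability, then normalize to probabilities
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)
```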
Optimizers (update rules sketched below):
- Gradient Descent
- AdaGrad
- RMSProp
- Adam
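The update rules below are the textbook forms of these optimizers, written as per-step functions that pass their state (squared-gradient cache, moment estimates) explicitly. The hyperparameter defaults (`lr`, `decay`, `beta1`, `beta2`, `eps`) are common choices, not values taken from this repository:

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # Plain gradient descent: step against the gradient
    return w - lr * grad

def adagrad_step(w, grad, cache, lr=0.01, eps=1e-8):
    # Accumulate squared gradients; effective per-parameter rate shrinks over time
    cache = cache + grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def rmsprop_step(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    # Exponentially decaying average of squared gradients instead of a full sum
    cache = decay * cache + (1 - decay) * grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # First and second moment estimates with bias correction; t is the 1-based step count
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```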
Loss Functions (sketched below):
- Cross-Entropy Loss
- Hinge Loss
- Mean Squared Error (MSE)
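A minimal NumPy sketch of the three losses, assuming one-hot labels and, for the hinge loss, the multiclass (Weston-Watkins) formulation; the repository may use a different variant:

```python
import numpy as np

def cross_entropy(probs, y_true, eps=1e-12):
    # Mean negative log-likelihood of the true class;
    # probs are softmax outputs, y_true is one-hot encoded
    return -np.mean(np.sum(y_true * np.log(probs + eps), axis=-1))

def hinge_loss(scores, y_true):
    # Penalize wrong-class scores within a margin of 1 of the true-class score
    correct = np.sum(scores * y_true, axis=-1, keepdims=True)
    margins = np.maximum(0.0, scores - correct + 1.0) * (1 - y_true)
    return np.mean(np.sum(margins, axis=-1))

def mse(y_pred, y_true):
    # Mean squared error, averaged over all elements
    return np.mean((y_pred - y_true) ** 2)
```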
The network can be trained with any combination of these activation functions, optimizers, and loss functions; a toy end-to-end example is sketched below.
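As a rough illustration of how the pieces combine (reusing `softmax`, `cross_entropy`, and `sgd_step` from the sketches above), one gradient-descent training loop for a one-layer softmax classifier on toy data. This is a hypothetical example, not the repository's API:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))            # toy batch: 32 samples, 4 features
y = np.eye(3)[rng.integers(0, 3, 32)]   # one-hot labels over 3 classes
W = rng.normal(scale=0.1, size=(4, 3))  # single linear layer

for step in range(100):
    probs = softmax(X @ W)              # forward pass with softmax activation
    loss = cross_entropy(probs, y)      # cross-entropy loss
    grad = X.T @ (probs - y) / len(X)   # gradient of the loss w.r.t. W
    W = sgd_step(W, grad, lr=0.5)       # plain gradient-descent update
```

Swapping in a different optimizer only changes the last line (plus the state it carries), which is the point of keeping the update rules as standalone step functions.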
Written in Jupyter Notebook. MIT License.