A basic neural network implementation.
Included components (an illustrative sketch of each follows the list):
- ReLU (forward and backward)
- Sigmoid (forward and backward)
- L2 regularization
- Dropout
- Stochastic gradient descent (SGD)
- Parameter updates with momentum
- Parameter updates with Adam
- Batch normalization (work in progress)
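
The ReLU and sigmoid layers with their backward passes might look like the following minimal sketch. It assumes NumPy arrays and a simple `(output, cache)` return convention; the function names are illustrative, not necessarily the repo's actual API.

```python
import numpy as np

def relu_forward(x):
    """ReLU forward pass: max(0, x); cache the input for the backward pass."""
    return np.maximum(0, x), x

def relu_backward(dout, cache):
    """ReLU backward: pass the upstream gradient where the input was positive."""
    x = cache
    return dout * (x > 0)

def sigmoid_forward(x):
    """Sigmoid forward pass; cache the output, which the gradient reuses."""
    out = 1.0 / (1.0 + np.exp(-x))
    return out, out

def sigmoid_backward(dout, cache):
    """Sigmoid backward: dout * s * (1 - s), using the cached output s."""
    s = cache
    return dout * s * (1.0 - s)
```

Caching the sigmoid's output rather than its input avoids recomputing the exponential in the backward pass.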
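L2 regularization adds a penalty of (lam / 2) * sum of squared weights to the loss, which contributes lam * W to each weight gradient. A minimal sketch, assuming the weights are NumPy arrays and `lam` is the regularization strength (both names are hypothetical):

```python
import numpy as np

def l2_penalty(weights, lam):
    """L2 penalty added to the data loss: (lam / 2) * sum of all squared weights."""
    return 0.5 * lam * sum(np.sum(w ** 2) for w in weights)

def l2_grad(w, lam):
    """Gradient contribution of the L2 penalty for one weight array: lam * w."""
    return lam * w
```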
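Dropout is commonly implemented in its "inverted" form, where surviving activations are rescaled during training so that no scaling is needed at test time. A minimal sketch under that assumption; the names and the `keep_prob` convention are illustrative:

```python
import numpy as np

def dropout_forward(x, keep_prob, train=True):
    """Inverted dropout: zero each unit with probability 1 - keep_prob and
    rescale the survivors by keep_prob, keeping the expected activation fixed."""
    if not train:
        return x, None
    mask = (np.random.rand(*x.shape) < keep_prob) / keep_prob
    return x * mask, mask

def dropout_backward(dout, mask):
    """Backward pass: gradients flow only through the units that were kept."""
    return dout * mask
```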
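Vanilla SGD steps each parameter against its gradient. A minimal sketch, assuming `params` and `grads` are dicts of NumPy arrays with matching keys (an assumed layout, not the repo's confirmed one):

```python
def sgd_update(params, grads, lr=0.01):
    """Vanilla SGD: step each parameter against its gradient."""
    for key in params:
        params[key] -= lr * grads[key]
```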
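Momentum keeps an exponentially decaying velocity per parameter and steps by the velocity rather than the raw gradient, smoothing the update direction. A minimal sketch, assuming `velocity` is a dict of zero-initialized arrays shaped like `params`:

```python
def momentum_update(params, grads, velocity, lr=0.01, beta=0.9):
    """SGD with momentum: v = beta * v - lr * grad, then step by v."""
    for key in params:
        velocity[key] = beta * velocity[key] - lr * grads[key]
        params[key] += velocity[key]
```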
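Adam maintains bias-corrected estimates of the gradient's first and second moments and uses them to set a per-parameter step size. A minimal sketch, assuming `m` and `v` are zero-initialized dicts shaped like `params` and `t` is the 1-indexed step count:

```python
import numpy as np

def adam_update(params, grads, m, v, t, lr=0.001,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: bias-corrected moment estimates drive a per-parameter step size."""
    for key in params:
        m[key] = beta1 * m[key] + (1 - beta1) * grads[key]       # first moment
        v[key] = beta2 * v[key] + (1 - beta2) * grads[key] ** 2  # second moment
        m_hat = m[key] / (1 - beta1 ** t)                        # bias correction
        v_hat = v[key] / (1 - beta2 ** t)
        params[key] -= lr * m_hat / (np.sqrt(v_hat) + eps)
```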
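Batch normalization is marked above as work in progress, so only the training-mode forward pass is sketched here. A complete layer would also track running statistics for inference and implement the backward pass; everything below is an assumption about how it might look, not the repo's code:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Training-mode batch norm: normalize each feature over the batch,
    then scale by gamma and shift by beta (both learnable)."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```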
License: MIT