DL-from-Scratch

Implementations of different types of neural networks from scratch.


Neural-Networks

Initialization

  • Weight initialization methods
    • Random initialization
    • Xavier initialization
    • He initialization
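
The three schemes above differ only in the scale of the initial weights. A minimal numpy sketch (function names are illustrative, not this repo's API):

```python
import numpy as np

def random_init(fan_in, fan_out, scale=0.01):
    # Plain random initialization: small Gaussian weights
    return np.random.randn(fan_in, fan_out) * scale

def xavier_init(fan_in, fan_out):
    # Xavier/Glorot: variance ~ 1/fan_in, suited to tanh/sigmoid layers
    return np.random.randn(fan_in, fan_out) / np.sqrt(fan_in)

def he_init(fan_in, fan_out):
    # He: variance ~ 2/fan_in, compensates for ReLU zeroing half its inputs
    return np.random.randn(fan_in, fan_out) * np.sqrt(2.0 / fan_in)
```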

Activation functions and loss functions

  • Activation functions and their derivatives
    • Identity
    • Sigmoid
    • Softmax
    • Tanh
    • ReLU
  • Loss functions and their derivatives (see the sketch after this list)
    • Mean Squared Error
    • Log-likelihood
    • Cross Entropy
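
A hedged numpy sketch of a few of the activation/loss pairs above (sigmoid, ReLU, and binary cross-entropy); names and the clipping epsilon are illustrative, not this repo's API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative expressed through the activation itself: s * (1 - s)
    s = sigmoid(z)
    return s * (1.0 - s)

def relu(z):
    return np.maximum(0.0, z)

def relu_prime(z):
    # Subgradient: 1 where z > 0, else 0
    return (z > 0).astype(float)

def cross_entropy(a, y, eps=1e-12):
    # Mean binary cross-entropy; clipping guards against log(0)
    a = np.clip(a, eps, 1.0 - eps)
    return -np.mean(y * np.log(a) + (1.0 - y) * np.log(1.0 - a))
```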

Optimizers

  • Stochastic mini-batch Gradient Descent
  • Momentum-based Gradient Descent
  • Nesterov accelerated Gradient Descent (update rules sketched below)
  • ReadMe
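
The two momentum variants differ only in where the gradient is evaluated. A single-parameter sketch of the update rules (learning rate and beta are illustrative defaults; in mini-batch SGD the gradient would be averaged over a sampled batch before each step):

```python
def momentum_step(w, grad, v, lr=0.1, beta=0.9):
    # Classical momentum: v accumulates a decaying sum of past gradients
    v = beta * v - lr * grad
    return w + v, v

def nesterov_step(w, grad_fn, v, lr=0.1, beta=0.9):
    # Nesterov: the gradient is taken at the look-ahead point w + beta * v
    g = grad_fn(w + beta * v)
    v = beta * v - lr * g
    return w + v, v
```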

Feedforward Neural Network

  • fnn.py - Generic Feedforward Neural Network (a minimal forward-pass sketch follows this list)
  • customdl package
  • ReadMe
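
As a taste of what a generic feedforward pass looks like, here is a minimal numpy sketch (not the customdl API; the layer sizes and Xavier-style scaling are illustrative):

```python
import numpy as np

def feedforward(x, weights, biases, activation=np.tanh):
    # Propagate x through each (W, b) layer, applying the activation in turn
    a = x
    for W, b in zip(weights, biases):
        a = activation(W @ a + b)
    return a

# Example: a 3-4-2 network
sizes = [3, 4, 2]
weights = [np.random.randn(m, n) / np.sqrt(n) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]
print(feedforward(np.ones(3), weights, biases))
```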

Convolutional Neural Network

  • ...

To-do list

  • Use validation data for hyperparameter tuning
    • hyperparameters: epochs, mini-batch size, learning rate, momentum
  • Plots for monitoring loss and accuracy over epochs
    • With data as an argument (options: training_data, validation_data, test_data)
  • Regularization techniques: L1, L2, dropout
  • Add optimizers: Adam, RMSProp
  • CNN
  • RBF NN

To get started with neural networks, I recommend the playlist by 3Blue1Brown.