- Create a customized Feedforward Neural Network
- Available options:
- Weight initialization: Random, Xavier, He
- Activation functions: Identity, Sigmoid, Softmax, Tanh, ReLU
- Loss functions: MSE, Cross Entropy
- Optimizers: GD, Momentum-based GD, Nesterov accelerated GD
- Learning mode: online, mini-batch, batch
- Refer to the documentation of any class/method with help(class/method), e.g. help(FNN) or help(FNN.compile)
- For a high-level overview of the underlying theory, refer to:
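As a rough illustration of the weight-initialization options listed above (this is a generic NumPy sketch, not customdl's internal code):

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 784, 80  # e.g. MNIST input layer into an 80-neuron hidden layer

# Random: plain standard-normal weights
w_random = rng.standard_normal((fan_out, fan_in))

# Xavier: variance scaled by 1 / fan_in, commonly paired with sigmoid/tanh
w_xavier = rng.standard_normal((fan_out, fan_in)) / np.sqrt(fan_in)

# He: variance scaled by 2 / fan_in, commonly paired with ReLU
w_he = rng.standard_normal((fan_out, fan_in)) * np.sqrt(2.0 / fan_in)
```

The scaled variants keep activation variance roughly constant across layers, which helps deeper networks train.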
$ [sudo] pip3 install customdl
$ git clone https://github.com/Taarak9/Custom-DL.git
>>> from customdl import FNN
import numpy as np
from matplotlib import pyplot as plt
from mnist_loader import load_data_wrapper
from customdl import FNN
# MNIST data split
training_data, validation_data, test_data = load_data_wrapper()
# FNN with 784 input neurons and cross-entropy loss
hdr = FNN(784, "ce")
# Hidden layer: 80 sigmoid neurons; output layer: 10 sigmoid neurons
hdr.add_layer(80, "sigmoid")
hdr.add_layer(10, "sigmoid")
hdr.compile()
hdr.fit(training_data, validation_data)
hdr.accuracy(test_data)
The mnist_loader used can be found here.
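The "ce" argument above selects cross entropy. customdl's exact implementation isn't shown here, but a typical cross-entropy loss (the function name and clipping epsilon below are illustrative) looks like:

```python
import numpy as np

def cross_entropy(y_pred, y_true, eps=1e-12):
    # Mean cross-entropy over samples; eps guards against log(0)
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))

# One-hot target for digit 3 and a fairly confident correct prediction
y_true = np.zeros((1, 10)); y_true[0, 3] = 1.0
y_pred = np.full((1, 10), 0.01); y_pred[0, 3] = 0.91
loss = cross_entropy(y_pred, y_true)  # low loss: prediction matches target
```

The loss shrinks toward zero as the predicted probability of the true class approaches 1.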
- Plots for monitoring loss and accuracy over epochs
- Regularization techniques: L1, L2, dropout
- Optimizers: Adam, RMSProp
- RBF NN
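The regularization techniques above are planned, not yet in the library. As a rough sketch of what an L2 penalty adds to a gradient-descent update (function name and hyperparameters below are hypothetical):

```python
import numpy as np

def gd_step_l2(w, grad, lr=0.1, lam=1e-3):
    # Gradient-descent update with an L2 penalty gradient lam * w added
    return w - lr * (grad + lam * w)

w = np.ones(3)
grad = np.zeros(3)
w_new = gd_step_l2(w, grad)
# with a zero data gradient, the penalty term alone shrinks the weights
```

L1 would instead add lam * sign(w), pushing small weights exactly to zero.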