Implementation of a neural network library from scratch, using only Python and NumPy. Inspired by the old Lua versions of Torch, before autograd: every module hand-codes its own forward and backward passes.
Made alone, without a group.
Machine Learning project from the first year of the Master DAC at Sorbonne University.
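Since there is no autograd, each layer implements both its forward pass and its gradient computations by hand. Below is a minimal sketch of what such an interface can look like; the class layout and method names are illustrative assumptions, not necessarily the exact API of projet_etu.py:

```python
import numpy as np

class Module:
    # Hypothetical pre-autograd base class: each layer writes both passes by hand.
    def forward(self, X):
        raise NotImplementedError
    def backward_update_gradient(self, X, delta):
        raise NotImplementedError  # accumulate dLoss/dParameters
    def backward_delta(self, X, delta):
        raise NotImplementedError  # return dLoss/dInput for the layer below
    def zero_grad(self):
        pass
    def update_parameters(self, gradient_step=1e-3):
        pass

class Linear(Module):
    def __init__(self, n_in, n_out):
        # small random init for the weights, zero bias
        self._parameters = np.random.randn(n_in, n_out) * 0.1
        self._bias = np.zeros(n_out)
        self.zero_grad()

    def zero_grad(self):
        self._gradient = np.zeros_like(self._parameters)
        self._grad_bias = np.zeros_like(self._bias)

    def forward(self, X):
        # (batch, n_in) @ (n_in, n_out) -> (batch, n_out)
        return X @ self._parameters + self._bias

    def backward_update_gradient(self, X, delta):
        # dL/dW = X^T delta ; dL/db = sum of delta over the batch
        self._gradient += X.T @ delta
        self._grad_bias += delta.sum(axis=0)

    def backward_delta(self, X, delta):
        # dL/dX = delta W^T
        return delta @ self._parameters.T

    def update_parameters(self, gradient_step=1e-3):
        # plain SGD step
        self._parameters -= gradient_step * self._gradient
        self._bias -= gradient_step * self._grad_bias
```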
- activation.py : contains all activation functions (TanH, sigmoid, ReLU)
- loss.py : contains all loss functions (CrossEntropyLoss, BCELoss, MSELoss)
- convolution.py : contains functions for 2D convolution (a naive NumPy sketch of the core operation appears after this list)
- mltools.py : contains plotting functions to visualize results
- utils.py : contains some useful functions (mini-batching, one-hot encoding, data loading)
- projet_etu.py : contains Torch-like modules (Linear, Sequential, Optim, SGD) and also the code for 1D convolution (which should arguably live in convolution.py); see the training-loop sketch after this list
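To give an idea of how the pieces fit together, here is a hedged end-to-end sketch of a manual training loop. It reuses the `Linear` class from the sketch above and defines a throwaway `TanH` and MSE inline, so all names and signatures are illustrative, not the exact API of projet_etu.py, activation.py, or loss.py:

```python
import numpy as np

class TanH:
    # hand-written activation: forward pass and its local derivative
    def forward(self, X):
        return np.tanh(X)
    def backward_delta(self, X, delta):
        return delta * (1.0 - np.tanh(X) ** 2)  # d tanh/dx = 1 - tanh^2

def mse(yhat, y):
    return np.mean((yhat - y) ** 2)

def mse_grad(yhat, y):
    return 2.0 * (yhat - y) / y.shape[0]

# toy regression problem: learn y = sin(x)
X = np.linspace(-2, 2, 64).reshape(-1, 1)
y = np.sin(X)

lin1, act, lin2 = Linear(1, 16), TanH(), Linear(16, 1)

for epoch in range(500):
    # forward pass, keeping every intermediate input for the backward pass
    h = lin1.forward(X)
    a = act.forward(h)
    yhat = lin2.forward(a)

    # backward pass: propagate delta layer by layer, by hand
    lin1.zero_grad(); lin2.zero_grad()
    delta = mse_grad(yhat, y)
    lin2.backward_update_gradient(a, delta)
    delta = lin2.backward_delta(a, delta)
    delta = act.backward_delta(h, delta)
    lin1.backward_update_gradient(X, delta)

    # SGD step on both layers
    lin1.update_parameters(gradient_step=1e-2)
    lin2.update_parameters(gradient_step=1e-2)

print("final MSE:", mse(lin2.forward(act.forward(lin1.forward(X))), y))
```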
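For convolution.py, the core operation of a 2D convolution is a sliding dot product between the kernel and each image patch. A minimal NumPy sketch of a "valid" single-channel version (the function name and signature are hypothetical; the actual code in convolution.py may differ):

```python
import numpy as np

def conv2d_valid(x, kernel):
    """'Valid' 2D cross-correlation of one single-channel image with one kernel.
    Hypothetical helper for illustration only."""
    h, w = x.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # dot product between the kernel and the patch under it
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

# Example: a 3x3 vertical-edge kernel on a 5x5 image gives a 3x3 output
img = np.arange(25, dtype=float).reshape(5, 5)
k = np.array([[1.0, 0.0, -1.0]] * 3)
print(conv2d_valid(img, k).shape)  # (3, 3)
```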