L-Layer-Deep-NeuralNetwork

A regularized L-layer Deep Neural Network built from scratch.


L-Layer-Regularized-Deep-NeuralNetwork from Scratch

Using the NumPy library, this project builds an L-layer Deep Neural Network that supports a configurable activation function for each layer, along with Ridge (L2) regularization.
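As a rough illustration of what Ridge regularization adds to the training objective, the sketch below computes a cross-entropy cost plus an L2 penalty over all weight matrices. The function name `l2_cost`, the `parameters` dictionary layout (`'W1'`, `'b1'`, ...), and the `lambd` argument are assumptions for this example, not necessarily the repo's actual API.

```python
import numpy as np

def l2_cost(AL, Y, parameters, lambd):
    """Cross-entropy cost plus a Ridge (L2) penalty on every weight matrix.

    AL: predictions, shape (1, m); Y: labels, shape (1, m);
    parameters: dict with keys 'W1'..'WL' (and biases 'b1'..'bL');
    lambd: regularization strength.
    (Illustrative sketch; names are assumptions, not the repo's API.)
    """
    m = Y.shape[1]
    # Standard binary cross-entropy, averaged over the m examples.
    cross_entropy = -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m
    # Ridge term: (lambda / 2m) * sum of squared entries of every W matrix.
    L = len([k for k in parameters if k.startswith('W')])
    l2_penalty = (lambd / (2 * m)) * sum(
        np.sum(np.square(parameters['W' + str(l)])) for l in range(1, L + 1))
    return cross_entropy + l2_penalty
```

With `lambd = 0` this reduces to the plain cross-entropy cost; larger values shrink the weights toward zero during gradient descent.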

activation is an argument to the forward_propagation() function: a list of integer codes (1, 2, or 3), one per layer. To use sigmoid (or tanh) activation for the n-th layer, set the n-th entry of the activation list to 1 (or 2).
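The per-layer code scheme above can be sketched as follows. The mapping of 1 to sigmoid and 2 to tanh comes from the text; the meaning of code 3 is not stated, so ReLU is assumed here. The helper `apply_activation` and the `forward_propagation` signature shown are illustrative, and may differ from the notebook's actual implementation.

```python
import numpy as np

def apply_activation(Z, code):
    """Map an integer code to a nonlinearity: 1 -> sigmoid, 2 -> tanh,
    3 -> ReLU (assumed meaning; the README only documents 1 and 2)."""
    if code == 1:
        return 1.0 / (1.0 + np.exp(-Z))   # sigmoid
    elif code == 2:
        return np.tanh(Z)                 # tanh
    elif code == 3:
        return np.maximum(0, Z)           # ReLU (assumption)
    raise ValueError(f"unknown activation code: {code}")

def forward_propagation(X, parameters, activation):
    """One possible shape for the forward pass: activation[n-1] selects the
    nonlinearity for layer n. Sketch only; the repo's signature may differ."""
    A = X
    for l in range(1, len(activation) + 1):
        Z = parameters['W' + str(l)] @ A + parameters['b' + str(l)]
        A = apply_activation(Z, activation[l - 1])
    return A
```

For example, `activation = [3, 3, 1]` would give a three-layer network with ReLU on the hidden layers and a sigmoid output under this encoding.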