Daedalus

Deep Learning Research


R1: Normalizing Activations with Good Inits and Activations

In this notebook we explore different initialization methods together with activation functions, and see how each affects the activation outputs across $N$ layers. A rough sketch of the experiment follows the paper links.

Links to papers:
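
As a rough illustration of the experiment (a minimal NumPy sketch, not the notebook's actual code; the function and parameter names here are made up for the example), one can push a random batch through $N$ linear layers and track the standard deviation of the activations under different weight initializations:

```python
import numpy as np

def layer_stats(n_layers=50, dim=512, init="kaiming", act=np.tanh, seed=0):
    """Push a random batch through n_layers linear layers and return
    the standard deviation of the activations at each depth."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((1024, dim))
    stds = []
    for _ in range(n_layers):
        if init == "kaiming":   # He et al.: std = sqrt(2 / fan_in)
            w = rng.standard_normal((dim, dim)) * np.sqrt(2.0 / dim)
        elif init == "lecun":   # LeCun: std = sqrt(1 / fan_in)
            w = rng.standard_normal((dim, dim)) * np.sqrt(1.0 / dim)
        else:                   # naive: unit-variance weights
            w = rng.standard_normal((dim, dim))
        x = act(x @ w)
        stds.append(float(x.std()))
    return stds

relu = lambda z: np.maximum(z, 0.0)
print(layer_stats(init="kaiming", act=relu)[-1])  # stays roughly order 1
print(layer_stats(init="naive", act=relu)[-1])    # blows up within a few layers
```

With Kaiming init the ReLU activation statistics stay roughly stable with depth, while unit-variance weights make them explode; a too-small scale would make them vanish instead.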

R2: Self-Normalising Neural Networks

In this notebook we explore Self-Normalising Neural Networks. A rough sketch of the SELU activation they are built on follows the paper links.

Links to papers:
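
A minimal sketch of the self-normalising idea (not the notebook's code; the helper names are assumptions), using the SELU constants from Klambauer et al. (2017) with LeCun-normal weights:

```python
import numpy as np

# SELU constants from Klambauer et al. (2017)
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(z):
    return SCALE * np.where(z > 0.0, z, ALPHA * (np.exp(z) - 1.0))

def propagate(n_layers=50, dim=512, seed=0):
    """With LeCun-normal weights (std = 1/sqrt(fan_in)), SELU pulls the
    activations toward zero mean / unit variance at every depth."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((1024, dim))
    for _ in range(n_layers):
        w = rng.standard_normal((dim, dim)) / np.sqrt(dim)
        x = selu(x @ w)
    return float(x.mean()), float(x.std())

print(propagate())  # ≈ (0.0, 1.0) even after 50 layers
```

Zero mean and unit variance form an attracting fixed point of this map, which is why deep SELU networks can train without explicit normalization layers.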