Manav54321/DL_31_Batch_Normalization
This notebook shows how Batch Normalization transforms the internal workings of neural networks by normalizing the inputs to each layer within every mini-batch. By keeping activations stable throughout training, Batch Normalization speeds up convergence and helps mitigate the vanishing/exploding gradient problem.
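As a rough illustration of the transform described above (a minimal NumPy sketch, not code taken from this notebook; the function name and array shapes are assumptions), the forward pass normalizes each feature using the mini-batch mean and variance, then applies a learnable scale and shift:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift.

    x:     (batch_size, num_features) activations from the previous layer
    gamma: (num_features,) learnable scale
    beta:  (num_features,) learnable shift
    """
    mu = x.mean(axis=0)                      # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)    # normalized activations: zero mean, unit variance
    return gamma * x_hat + beta              # learnable scale/shift restores representational power

# Example: a mini-batch of 4 samples with 3 poorly scaled features
x = np.random.randn(4, 3) * 10 + 5
out = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0), out.var(axis=0))     # ~0 mean, ~1 variance per feature
```

Because each feature is rescaled to roughly zero mean and unit variance regardless of how the previous layer's outputs drift, gradients stay in a well-behaved range, which is what allows the faster, more stable convergence mentioned above.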
Jupyter Notebook · Apache-2.0 license