CNN from Scratch

Implementing a CNN using Python and NumPy, following this blog.

Pointers:


Xavier initialization is a method for initializing the weights of a deep neural network: each weight is drawn at random from a normal distribution with mean 0 and variance 1/n, where n is the number of nodes in the previous layer. The goal is to keep the variance of the activations roughly constant across every layer, which helps prevent the gradients from vanishing or exploding.
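
As a rough sketch, Xavier initialization for a fully connected layer could look like the following (the function name and layer sizes here are illustrative, not from the blog):

```python
import numpy as np

def xavier_init(n_in, n_out, seed=None):
    # Draw weights from N(0, 1/n_in) so the variance of the
    # activations stays roughly constant from layer to layer.
    rng = np.random.default_rng(seed)
    return rng.normal(loc=0.0, scale=np.sqrt(1.0 / n_in), size=(n_in, n_out))

# Example: a layer mapping 256 inputs to 128 outputs.
W = xavier_init(256, 128, seed=0)
print(W.shape)  # (256, 128)
# W.std() should be close to sqrt(1/256) = 0.0625
```

Note that the variance is set from the fan-in (`n_in`) only; the related Glorot variant averages fan-in and fan-out instead.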