ResNet-CIFAR10

We will implement a ResNet from scratch and test it on the CIFAR-10 dataset. ResNet is a neural network architecture proposed as a solution to the degradation problem: adding more layers to a network improves performance up to a certain point, but adding even more layers beyond that point results in significant performance degradation. ResNet addresses this by modifying the target mapping of the network. If a stack of layers should converge to a mapping $H(x)$, ResNet instead models the residual function $F(x) = H(x) - x$ and adds $x$ back in through a shortcut connection, so the block outputs $F(x) + x = H(x)$. The main hypothesis is that the residual mapping $F(x)$ is easier to learn than the original mapping $H(x)$. With this change, ResNet outperformed earlier networks and remains a powerful and widely used CNN architecture today.
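
Below is a minimal sketch of one such residual block in PyTorch. The framework choice, the `BasicBlock` name, and the layer sizes are illustrative assumptions rather than a reference to this repo's actual code; the point is only to show the $F(x) + x$ shortcut in code.

```python
import torch
import torch.nn as nn


class BasicBlock(nn.Module):
    """Residual block: two 3x3 convs model F(x), the shortcut adds x back."""

    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        # These layers model the residual function F(x).
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3,
                               stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)

        # The shortcut is the identity unless the shape changes, in which
        # case a 1x1 convolution projects x to the matching shape.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=1,
                          stride=stride, bias=False),
                nn.BatchNorm2d(out_channels),
            )

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))  # first half of F(x)
        out = self.bn2(self.conv2(out))            # second half of F(x)
        out = out + self.shortcut(x)               # F(x) + x
        return torch.relu(out)


# Quick shape check on a CIFAR-10-sized input (3x32x32), for illustration.
if __name__ == "__main__":
    block = BasicBlock(3, 16)
    x = torch.randn(1, 3, 32, 32)
    print(block(x).shape)  # torch.Size([1, 16, 32, 32])
```

Because the block only has to learn the residual $F(x)$, driving its weights toward zero recovers the identity mapping, which is what makes very deep stacks of these blocks trainable.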