A set of notebooks exploring topics related to deep learning.

- WGAN paper study: replicate some results from the WGAN paper (a critic-update sketch appears after this list).
- CNN architectures: look into the structure of common CNN architectures, such as ResNet, ResNeXt, SENet, DenseNet, Inception V4, WRN, Xception, VGG, etc., and how to use them in fastai (see the fastai sketch below).
- An easy way to do the backward propagation math: use a simple rule to derive back-propagation for all kinds of neural networks, such as LSTMs and CNNs (see the worked example below).
- Resume interrupted 1cycle policy training: split a long training run into smaller chunks and resume where the previous chunk left off (see the checkpointing sketch below).
- How does the LSTM's memory work?: dig into the LSTM's internal states to see how it manages to generate valid XML text (see the state-inspection sketch below).
- To be continued ...
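
As a rough reference for the WGAN notebook, here is a minimal PyTorch sketch of a single critic update with the paper's weight clipping; the tiny MLP critic/generator, the random stand-in batch, and the batch size are placeholders, not the notebook's actual setup.

```python
import torch
import torch.nn as nn

# Placeholder MLPs standing in for whatever models the notebook actually uses.
latent_dim = 64
critic = nn.Sequential(nn.Linear(784, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))
generator = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, 784), nn.Tanh())
opt_critic = torch.optim.RMSprop(critic.parameters(), lr=5e-5)   # optimizer and lr from the paper

real = torch.randn(32, 784)                              # stand-in for a mini-batch of real samples
fake = generator(torch.randn(32, latent_dim)).detach()   # detached: this step only trains the critic

# The critic maximises E[D(real)] - E[D(fake)], so we minimise the negation.
loss_critic = critic(fake).mean() - critic(real).mean()
opt_critic.zero_grad()
loss_critic.backward()
opt_critic.step()

# Weight clipping keeps the critic roughly 1-Lipschitz, as in the original WGAN paper.
with torch.no_grad():
    for p in critic.parameters():
        p.clamp_(-0.01, 0.01)
```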
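
For the CNN-architectures notebook, a minimal sketch of building a transfer-learning model on top of one of these backbones, assuming the fastai v2 API (`vision_learner`; earlier fastai versions call it `cnn_learner`) and a hypothetical image folder with one sub-directory per class:

```python
from fastai.vision.all import *

# Hypothetical dataset path: one sub-directory per class under data/images.
path = Path("data/images")
dls = ImageDataLoaders.from_folder(path, valid_pct=0.2, item_tfms=Resize(224))

# vision_learner wraps a torchvision backbone with a new head for the dataset's classes;
# swapping resnet34 for densenet121, vgg16_bn, etc. switches the architecture.
learn = vision_learner(dls, resnet34, metrics=accuracy)
learn.fine_tune(1)
```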
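
For the back-propagation notebook, a small worked example of the kind of bookkeeping the "simple rule" automates: derive the weight gradient of a single linear layer by hand and check it against autograd (the shapes and data here are arbitrary).

```python
import torch

# Forward pass: y = x @ W, loss = mean((y - t)^2).
torch.manual_seed(0)
x = torch.randn(4, 3)
W = torch.randn(3, 2, requires_grad=True)
t = torch.randn(4, 2)

y = x @ W
loss = ((y - t) ** 2).mean()
loss.backward()

# By hand: dL/dy = 2 * (y - t) / N, then dL/dW = x.T @ dL/dy
# (the upstream gradient times the local Jacobian, transposed to match shapes).
grad_y = (2 * (y - t) / y.numel()).detach()
grad_W_manual = x.T @ grad_y

print(torch.allclose(W.grad, grad_W_manual))   # True
```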
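
The interrupted-training notebook works with fastai's 1cycle policy; the sketch below shows the same idea in plain PyTorch rather than the notebook's own code: `OneCycleLR` keeps its position in the schedule inside its `state_dict`, so checkpointing the scheduler together with the model and optimizer lets a later run continue mid-cycle. The toy model, random data, and file name are placeholders.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
sched = torch.optim.lr_scheduler.OneCycleLR(opt, max_lr=0.1, total_steps=1000)

# First chunk of training: 100 of the 1000 scheduled steps.
for step in range(100):
    opt.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()
    loss.backward()
    opt.step()
    sched.step()

torch.save({"model": model.state_dict(),
            "opt": opt.state_dict(),
            "sched": sched.state_dict()}, "ckpt.pt")

# Later (possibly in a new process): rebuild the same objects, then restore all three
# states so the learning-rate schedule picks up exactly where step 100 left off.
ckpt = torch.load("ckpt.pt")
model.load_state_dict(ckpt["model"])
opt.load_state_dict(ckpt["opt"])
sched.load_state_dict(ckpt["sched"])
```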
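
For the LSTM-memory notebook, a sketch of how to expose an LSTM's hidden and cell states in PyTorch; the notebook's character-level XML model is replaced here by random token data and an untrained network, so only the mechanics carry over.

```python
import torch
import torch.nn as nn

vocab_size, emb_dim, hidden_dim = 50, 16, 32
emb = nn.Embedding(vocab_size, emb_dim)
lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

tokens = torch.randint(0, vocab_size, (1, 20))   # one sequence of 20 character ids

out, (h, c) = lstm(emb(tokens))
print(out.shape)   # (1, 20, 32): hidden state at every time step
print(h.shape)     # (1, 1, 32):  final hidden state
print(c.shape)     # (1, 1, 32):  final cell state, the "long-term memory"

# nn.LSTM only returns the final cell state, so to trace the memory over time,
# feed one step at a time and collect c after each character.
states, h_c = [], None
for i in range(tokens.size(1)):
    _, h_c = lstm(emb(tokens[:, i:i + 1]), h_c)
    states.append(h_c[1].squeeze().detach())
cell_trace = torch.stack(states)   # (20, 32): one row per character
```

Plotting individual columns of `cell_trace` against the input characters is the kind of probing that can show a unit tracking, say, whether the model is currently inside an open tag.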