__Revamped version of my Intro. to Deep Learning tutorial for the New Frontiers Initiative Webinar Series. More info at: https://bluewaters.ncsa.illinois.edu/NFI/Webinars/DeepLearning__
This tutorial covers the basics of Deep Learning with Convolutional Neural Networks. It is broken into four notebooks; the topics covered in each notebook are:
Intro.ipynb:
- Linear Regression as a single-layer, single-neuron model, motivating the introduction of Neural Networks as Universal Approximators: collections of neurons connected in an acyclic graph
- Convolutions and examples of simple image filters, motivating the construction of Convolutional Neural Networks
- Loss/Error functions, Gradient Descent, Backpropagation, etc.
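The first notebook's core idea, fitting a single-neuron (linear regression) model by gradient descent on a mean-squared-error loss, can be sketched in plain NumPy. This is a hypothetical toy, not code from the notebooks; the data, learning rate, and iteration count are all assumptions for illustration:

```python
import numpy as np

# Toy data from the line y = 2x + 1 (no noise, for clarity)
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 1.0

# A single "neuron": y_hat = w*x + b, trained by gradient descent on MSE
w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    y_hat = w * x + b
    err = y_hat - y
    # Gradients of the mean squared error with respect to w and b
    grad_w = 2.0 * np.mean(err * x)
    grad_b = 2.0 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # close to the true slope 2.0 and intercept 1.0
```

Backpropagation in a deep network computes exactly these kinds of gradients, layer by layer, via the chain rule.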
Mnist.ipynb:
- Visualizing Data
- Constructing simple Convolutional Neural Networks
- Training and Inference
- Visualizing/Interpreting trained Neural Nets
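The convolution operation that the networks in these notebooks are built from can be sketched in plain NumPy. This is a hypothetical toy (a "valid" cross-correlation, as deep learning frameworks actually compute), applied with a Sobel-like vertical-edge filter; the image and kernel are made-up examples:

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" 2-D cross-correlation of a single-channel image with a kernel
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Sobel-like filter: responds where intensity changes left-to-right
kernel = np.array([[1, 0, -1],
                   [2, 0, -2],
                   [1, 0, -1]], dtype=float)

# Synthetic image: dark left half, bright right half
image = np.zeros((8, 8))
image[:, 4:] = 1.0

response = conv2d(image, kernel)
print(response.shape)  # (6, 6): the filter only fires along the vertical edge
```

A convolutional layer learns the kernel values instead of hand-designing them, and applies many such filters in parallel.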
CIFAR-10.ipynb:
- Data Generators
- Overfitting
- Data Augmentation
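Two of the most common augmentations used to fight overfitting, random horizontal flips and random crops, can be sketched in plain NumPy. This is a hypothetical illustration, not the generator code from the notebook; the function names, padding size, and batch shape are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_flip(images):
    # Flip each image in an (N, H, W, C) batch left-right with probability 0.5
    flips = rng.random(len(images)) < 0.5
    out = images.copy()
    out[flips] = out[flips, :, ::-1]
    return out

def random_crop(images, pad=4):
    # Zero-pad spatially, then take a random crop of the original size
    n, h, w, c = images.shape
    padded = np.pad(images, ((0, 0), (pad, pad), (pad, pad), (0, 0)))
    out = np.empty_like(images)
    for k in range(n):
        i = rng.integers(0, 2 * pad + 1)
        j = rng.integers(0, 2 * pad + 1)
        out[k] = padded[k, i:i + h, j:j + w]
    return out

batch = rng.random((8, 32, 32, 3))  # e.g. 8 CIFAR-10-sized images
augmented = random_crop(random_flip(batch))
print(augmented.shape)  # (8, 32, 32, 3): same shape, randomly perturbed content
```

Applying fresh random transforms each epoch effectively enlarges the training set without collecting new data.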
Image_Segmentation.ipynb:
- Semantic Segmentation
- UNet
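The shape bookkeeping behind UNet, downsample on the contracting path, upsample on the expanding path, and concatenate the encoder features back in via a skip connection, can be sketched in plain NumPy. This is a hypothetical illustration of the feature-map shapes only (no learned convolutions); all sizes are made-up examples:

```python
import numpy as np

def downsample(x):
    # 2x2 max pooling over an (H, W, C) feature map
    h, w, c = x.shape
    return x.reshape(h // 2, 2, w // 2, 2, c).max(axis=(1, 3))

def upsample(x):
    # Nearest-neighbor 2x upsampling
    return x.repeat(2, axis=0).repeat(2, axis=1)

encoder = np.random.rand(64, 64, 16)  # feature map on the contracting path
bottleneck = downsample(encoder)      # (32, 32, 16)
decoder = upsample(bottleneck)        # back to (64, 64, 16)

# Skip connection: concatenate channels, so the decoder sees both the
# upsampled context and the encoder's higher-resolution detail
merged = np.concatenate([encoder, decoder], axis=-1)
print(merged.shape)  # (64, 64, 32)
```

In the real UNet each arrow is a stack of convolutions, and a final per-pixel classifier on the full-resolution map produces the semantic segmentation.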
References:
The code examples presented here are mostly taken verbatim from, or inspired by, the following sources. I made this curation to give quick exposure to very basic but essential ideas and practices in deep learning so you can get started fairly quickly, but I recommend going to some or all of the actual sources for an in-depth survey: