Stanford_CS231N

Stanford CS231N Course: Assignment Implementations, Course Notes, and Paper References

The goal of this repo is to make full use of the Stanford CS231N course.

This repo mainly provides the following:

  1. For review purposes: a convenient way to read the Jupyter notebooks without setting up a notebook server locally.
  2. References for the papers that appear in the lectures, along with some notes about them.
  3. Nicely commented code, from helper functions to the overall project architecture, plus a guideline for how to work through it.
  4. An extension of the project into an end-to-end system: from data labeling to a research diary.

Resource collection contributors: Michael Wang

Lecture 2 - Image Classification Pipeline:

[1] "Deep Visual-Semantic Alignments for Generating Image Descriptions"[pdf]

Lecture 3 - Loss Functions and Optimization:

[1] "Object Recognition from Local Scale-Invariant Features"[pdf]

[2] "Histograms of Oriented Gradients for Human Detection"[pdf]

[3] "A Bayesian Hierarchical Model for Learning Natural Scene Categories"[pdf]

Lecture 4 - Backpropagation and Neural Networks:

[1] "Object Recognition from Local Scale-Invariant Features"[pdf]

[2] "Histograms of Oriented Gradients for Human Detection"[pdf]

Lecture 5 - Convolutional Neural Networks:

[1] "ImageNet Classification with Deep Convolutional Neural Networks"[pdf]

[2] "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks"[pdf]

[3] "DeepFace: Closing the Gap to Human-Level Performance in Face Verification"[pdf]

[4] "Deep Inside Convolutional Networks: Visualising Image Classification Models and Saliency Maps"[pdf]

[5] "Two-Stream Convolutional Networks for Action Recognition in Videos"[pdf]

[6] "DeepPose: Human Pose Estimation via Deep Neural Networks"[pdf]

[7] "Deep Learning for Real-Time Atari Game Play Using Offline Monte-Carlo Tree Search Planning"[pdf]

[8] "Breast Mass Classification from Mammograms using Deep Convolutional Neural Networks"[pdf]

[9] "Rotation-invariant convolutional neural networks for galaxy morphology prediction"[pdf]

[10] "Traffic Sign Recognition with Multi-Scale Convolutional Networks"[pdf]

[11] "Image Style Transfer Using Convolutional Neural Networks"[pdf]

[12] "Controlling Perceptual Factors in Neural Style Transfer"[pdf]

Lecture 6 - Training Neural Networks I:

[1] Papers on weight initialization are listed on the lecture slides.

[2] "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift"[pdf]

Lecture 7 - Training Neural Networks II:

[1] "On the importance of initialization and momentum in deep learning"[pdf]

[2] "Adaptive Subgradient Methods for Online Learning and Stochastic Optimization"[pdf]

[3] "ADAM: A METHOD FOR STOCHASTIC OPTIMIZATION"[pdf]

[4] "On Optimization Methods for Deep Learning"[pdf]

[5] "SGDR: STOCHASTIC GRADIENT DESCENT WITH WARM RESTARTS"[pdf]

[6] "SNAPSHOT ENSEMBLES: TRAIN 1, GET M FOR FREE"[pdf]

[7] "ACCELERATION OF STOCHASTIC APPROXIMATION BY AVERAGING*"[pdf]

[8] "Regularization of Neural Networks using DropConnect*"[pdf]

[9] "Deep Networks with Stochastic Depth"[pdf]

[10] "DeCAF: A Deep Convolutional Activation Feature for Generic Visual Recognition"[pdf]

[11] "DeCAF: A Deep Convolutional Activation Feature for Generic Visual Recognition"[pdf]

[12] "Fast RCNN"[pdf]

Lecture 8 - Deep Learning Software:

[1] "DEEP LEARNING WITH DYNAMIC COMPUTATION GRAPHS"[pdf]

[2] "Deep Compositional Question Answering with Neural Module Networks"[pdf]

[3] "Learning to Compose Neural Networks for Question Answering"[pdf]

Lecture 9 - CNN Architectures:

[1] "ImageNet Classification with Deep Convolutional Neural Networks(AlexNet)"[pdf]

[2] "Visualizing and Understanding Convolutional Networks"[pdf]

[3] "VERY DEEP CONVOLUTIONAL NETWORKS FOR LARGE-SCALE IMAGE RECOGNITION (VGG)"[pdf]

[4] "Going deeper with convolutions (Google Net, Inception Net)"[pdf]

[5] "Deep Residual Learning for Image Recognition (ResNet)"[pdf]

[6] "Network in Network (NiN)"[pdf]

[7] "Wide Residual Networks"[pdf]

[8] "Aggregated Residual Transformations for Deep Neural Networks (ResNext)"[pdf]

[9] "Wide Residual Networks"[pdf]

[10] "Deep Networks with Stochastic Depth"[pdf]

[11] "FRACTALNET: ULTRA-DEEP NEURAL NETWORKS WITHOUT RESIDUALS"[pdf]

[12] "Densely Connected Convolutional Networks"[pdf]

[13] "SQUEEZENET: ALEXNET-LEVEL ACCURACY WITH 50X FEWER PARAMETERS AND <0.5MB MODEL SIZE"[pdf]