Topics Course on Deep Learning for Spring 2016
by Joan Bruna, UC Berkeley, Statistics Department
## Syllabus
- Invariance, stability.
- Variability models (deformation model, stochastic model).
- Scattering
- Extensions
- Group Formalism
- Supervised Learning: classification.
- Properties of CNN representations: invertibility, stability, invariance.
- Covariance/invariance: capsules and related models.
- Connections with other models: dictionary learning, LISTA, Random Forests.
- Other tasks: localization, regression.
- Embeddings (DrLIM), inverse problems.
- Extensions to non-Euclidean domains.
- Dynamical systems: RNNs and optimal control.
- Guest Lecture: Wojciech Zaremba (OpenAI)
- Autoencoders (standard, denoising, contractive, etc.)
- Variational Autoencoders
- Generative Adversarial Networks
- Maximum Entropy Distributions
- Open Problems
- Guest Lecture: Ian Goodfellow (Google)
- Non-convex optimization theory for deep networks
- Stochastic Optimization
- Attention and Memory Models
- Guest Lecture: Yann Dauphin (Facebook AI Research)
---
Lec1 Jan 19: Intro and Logistics
---
Lec2 Jan 21: Representations for Recognition: stability, variability. Kernel approaches / feature extraction. Properties.
recommended reading:
- Elements of Statistical Learning, Chapter 12, Hastie, Tibshirani & Friedman.
- Understanding Deep Convolutional Networks, S. Mallat.
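As a pointer to the notion of stability discussed here: writing a deformed signal as $x_\tau(u) = x(u - \tau(u))$, a representation $\Phi$ is called stable to deformations when, schematically (in the sense of Mallat's framework, dropping lower-order terms),

$$ \|\Phi(x_\tau) - \Phi(x)\| \;\le\; C\, \|\nabla \tau\|_\infty\, \|x\|, $$

so that small, smooth warps move the representation proportionally little, while pure translations ($\nabla\tau = 0$) leave it essentially unchanged.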
---
Lec3 Jan 26: Groups, Invariants and Filters.
recommended reading
---
Lec4 Jan 28: Scattering Convolutional Networks.
recommended reading
further reading
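For orientation (a notational sketch following Mallat's formalism, not part of the reading list): scattering coefficients are obtained by cascading wavelet convolutions and pointwise moduli, followed by a low-pass average,

$$ S_J x(p) \;=\; \Big|\, \big|\, |x \ast \psi_{\lambda_1}| \ast \psi_{\lambda_2} \,\big| \cdots \ast \psi_{\lambda_m} \Big| \ast \phi_J, \qquad p = (\lambda_1, \dots, \lambda_m), $$

which provides translation invariance up to the scale $2^J$ together with stability to deformations.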
---
Lec5 Feb 2: Further Scattering: Properties and Extensions.
recommended reading
---
Lec6 Feb 4: Convolutional Neural Networks: Geometry and First Properties.
recommended reading
- Deep Learning, LeCun, Bengio & Hinton.
- Understanding Deep Convolutional Networks, S. Mallat.
---
Lec7 Feb 9: Properties of learnt CNN representations: covariance and invariance, redundancy, invertibility.
recommended reading
- Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy?, R. Giryes, G. Sapiro & A. Bronstein.
- Intriguing Properties of Neural Networks, C. Szegedy et al.
- Geodesics of Learnt Representations, O. Henaff & E. Simoncelli.
- Inverting Visual Representations with Convolutional Networks, A. Dosovitskiy & T. Brox.
- Visualizing and Understanding Convolutional Networks, M. Zeiler & R. Fergus.
---
Lec8 Feb 11: Connections with other models (dictionary learning, LISTA, Random Forests, CART).
recommended reading
- Proximal Splitting Methods in Signal Processing, Combettes & Pesquet.
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems, Beck & Teboulle.
- Learning Fast Approximations of Sparse Coding, K. Gregor & Y. LeCun.
- Task-Driven Dictionary Learning, J. Mairal, F. Bach & J. Ponce.
- Exploiting Generative Models in Discriminative Classifiers, T. Jaakkola & D. Haussler.
- Improving the Fisher Kernel for Large-Scale Image Classification, F. Perronnin et al.
- NetVLAD, R. Arandjelovic et al.
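To make the LISTA connection concrete, here is a minimal sketch of the learned-ISTA recurrence from Gregor & LeCun; the weights below are random placeholders for what would normally be learned, and the sizes are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, theta):
    """Elementwise shrinkage: sign(z) * max(|z| - theta, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - theta, 0.0)

def lista(x, W_e, S, theta, n_steps=3):
    """Unrolled ISTA with learned matrices (LISTA): z <- h_theta(W_e x + S z)."""
    b = W_e @ x                   # encoder pass, computed once
    z = soft_threshold(b, theta)  # first step starts from z = 0
    for _ in range(n_steps - 1):
        z = soft_threshold(b + S @ z, theta)
    return z

# Illustrative sizes: 64-dimensional input, 256-dimensional sparse code.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
W_e = rng.standard_normal((256, 64)) / 8.0    # would be learned in practice
S = rng.standard_normal((256, 256)) / 256.0   # would be learned in practice
print(lista(x, W_e, S, theta=0.1).shape)      # (256,)
```

In plain ISTA, `W_e` and `S` are fixed functions of the dictionary; LISTA instead trains them (and the threshold) by backpropagation so that a few unrolled iterations approximate the sparse code.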
---
Lec9 Feb 16: Other high-level tasks: localization, regression, embedding, inverse problems.
recommended reading
- Object Detection with Discriminatively Trained Part-Based Models, Felzenszwalb, Girshick, McAllester & Ramanan, PAMI'10.
- Deformable Part Models are Convolutional Neural Networks, Girshick, Iandola, Darrell & Malik, CVPR'15.
- Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation, Girshick, Donahue, Darrell & Malik, PAMI'14.
- Graphical Models, Message-Passing Algorithms and Convex Optimization, M. Wainwright.
- Conditional Random Fields as Recurrent Neural Networks, Zheng et al., ICCV'15.
- Joint Training of a Convolutional Network and a Graphical Model for Human Pose Estimation, Tompson, Jain, LeCun & Bregler, NIPS'14.
---
Lec10 Feb 18: Extensions to non-Euclidean domains. Representations of stationary processes. Properties.
recommended reading
- Dimensionality Reduction by Learning an Invariant Mapping, Hadsell, Chopra & LeCun, '06.
- Deep Metric Learning via Lifted Structured Feature Embedding, Oh Song, Xiang, Jegelka & Savarese, '15.
- Spectral Networks and Locally Connected Networks on Graphs, Bruna, Szlam, Zaremba & LeCun, '14.
- Spatial Transformer Networks, Jaderberg, Simonyan, Zisserman & Kavukcuoglu, '15.
- Intermittent Process Analysis with Scattering Moments, Bruna, Mallat, Bacry & Muzy, '14.
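As a concrete handle on the embedding papers above, a minimal sketch of a DrLIM-style contrastive loss on a pair of embeddings; the margin value and the exact weighting are assumptions of this sketch, not taken from the paper.

```python
import numpy as np

def contrastive_loss(f1, f2, similar, margin=1.0):
    """DrLIM-style pairwise loss on two embedding vectors f1, f2.

    similar=True  -> pull the pair together:      0.5 * d^2
    similar=False -> push it apart up to a margin: 0.5 * max(0, margin - d)^2
    """
    d = np.linalg.norm(f1 - f2)
    if similar:
        return 0.5 * d ** 2
    return 0.5 * max(0.0, margin - d) ** 2

# Toy usage on 2-D embeddings of a similar and a dissimilar pair.
a, b = np.array([0.0, 0.0]), np.array([0.1, 0.0])
print(contrastive_loss(a, b, similar=True), contrastive_loss(a, b, similar=False))
```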
---
Lec11 Feb 23: Guest Lecture (W. Zaremba, OpenAI): Discrete Neural Turing Machines.
---
Lec12 Feb 25: Representations of Stationary Processes (contd). Sequential Data: Recurrent Neural Networks.
recommended reading
- Intermittent Process Analysis with Scattering Moments, Bruna, Mallat, Bacry & Muzy, Annals of Statistics'13.
- A Mathematical Motivation for Complex-Valued Convolutional Networks, Tygert et al., Neural Computation'16.
- Texture Synthesis Using Convolutional Neural Networks, Gatys, Ecker & Bethge, NIPS'15.
- A Neural Algorithm of Artistic Style, Gatys, Ecker & Bethge, '15.
- Time Series Analysis and Its Applications, Shumway & Stoffer, Chapter 6.
- Deep Learning, Goodfellow, Bengio & Courville, '16, Chapter 10.
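Since the lecture turns to sequential data, a minimal sketch of the plain recurrent update $h_t = \tanh(W_{xh} x_t + W_{hh} h_{t-1} + b)$ covered in Chapter 10 of the Deep Learning book; dimensions and initialization here are illustrative.

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Plain RNN over a sequence: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in xs:                       # xs has shape (time, input_dim)
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)              # (time, hidden_dim)

# Toy sequence: 5 steps of 3-dimensional input, 8 hidden units.
rng = np.random.default_rng(0)
xs = rng.standard_normal((5, 3))
hs = rnn_forward(xs, rng.standard_normal((8, 3)),
                 0.1 * rng.standard_normal((8, 8)), np.zeros(8))
print(hs.shape)  # (5, 8)
```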
---
Lec13 Mar 1: Recurrent Neural Networks (contd). Unsupervised Learning: autoencoders. Density estimation. Parzen estimators. Restricted Boltzmann Machines. Curse of dimensionality.
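A minimal sketch of the Parzen-window density estimator mentioned in this lecture, using an isotropic Gaussian kernel with an arbitrarily chosen bandwidth.

```python
import numpy as np

def parzen_density(x, samples, h=0.5):
    """Parzen-window estimate: p(x) = (1/n) * sum_i N(x; x_i, h^2 I)."""
    n, d = samples.shape
    sq_dists = np.sum((samples - x) ** 2, axis=1)
    kernels = np.exp(-sq_dists / (2 * h ** 2)) / (2 * np.pi * h ** 2) ** (d / 2)
    return kernels.mean()

# Toy usage: density at the origin under 100 samples from a 2-D standard Gaussian.
rng = np.random.default_rng(0)
samples = rng.standard_normal((100, 2))
print(parzen_density(np.zeros(2), samples))
```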
---
Lec14 Mar 3: Variational Autoencoders
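For reference, the objective maximized by a variational autoencoder is the evidence lower bound (Kingma & Welling):

$$ \log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z\mid x)}\!\left[\log p_\theta(x\mid z)\right] \;-\; \mathrm{KL}\!\left(q_\phi(z\mid x)\,\|\,p(z)\right), $$

with the encoder $q_\phi$ trained through the reparameterization $z = \mu_\phi(x) + \sigma_\phi(x)\odot\epsilon$, $\epsilon \sim \mathcal{N}(0, I)$.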
---
Lec15 Mar 8: Generative Adversarial Networks
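For reference, the two-player objective from Goodfellow et al.:

$$ \min_G \max_D \;\; \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right] \;+\; \mathbb{E}_{z \sim p_z}\!\left[\log\left(1 - D(G(z))\right)\right], $$

where the discriminator $D$ is trained to separate data from samples $G(z)$ and the generator $G$ is trained to fool it.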
---
Lec16 Mar 10: Maximum Entropy Distributions
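As a reminder of the form such distributions take: maximizing entropy subject to moment constraints $\mathbb{E}_p[f_k(x)] = c_k$ yields an exponential-family (Gibbs) distribution,

$$ p(x) \;=\; \frac{1}{Z(\lambda)} \exp\!\Big(\sum_k \lambda_k f_k(x)\Big), $$

with the multipliers $\lambda_k$ chosen so that the constraints hold.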
---
Lec17 Mar 29: Self-supervised models (analogies, video prediction, text, word2vec).
---
Lec18 Mar 31: Guest Lecture (I. Goodfellow, Google Brain)
---
Lec19 Apr 5: Non-convex Optimization: parameter redundancy, spin glasses, optimality certificates, stability.
---
Lec20 Apr 7: Tensor Decompositions
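For orientation, the rank-$R$ CP (CANDECOMP/PARAFAC) decomposition of a third-order tensor, one of the basic objects here:

$$ T_{ijk} \;\approx\; \sum_{r=1}^{R} a_{ir}\, b_{jr}\, c_{kr}, $$

i.e. a sum of $R$ rank-one tensors $a_r \otimes b_r \otimes c_r$.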
---
Lec21 Apr 12: Stochastic Optimization, Batch Normalization, Dropout
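Minimal training-time sketches of the two layers named above; the epsilon and dropout rate are illustrative choices, and the arrays are toy data.

```python
import numpy as np

def batch_norm_train(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization on a (batch, features) array:
    standardize each feature over the batch, then scale and shift."""
    mu, var = x.mean(axis=0), x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

def dropout_train(x, rng, p=0.5):
    """Inverted dropout: zero each unit with probability p and rescale by
    1/(1-p) so the expected activation matches the test-time network."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

# Toy batch of 4 examples with 3 features.
rng = np.random.default_rng(0)
x = np.arange(12, dtype=float).reshape(4, 3)
print(batch_norm_train(x, gamma=np.ones(3), beta=np.zeros(3)))
print(dropout_train(x, rng))
```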
---
Lec22 Apr 14: Reasoning, Attention and Memory: new trends of the field and challenges; limits of sequential representations (the need for attention and memory); modern enhancements (NTMs, Memory Networks, Stack-RNNs, etc.).
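A minimal sketch of the content-based soft read shared by the memory models listed above; scoring by dot product is one common choice, and the dimensions are illustrative.

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def attention_read(query, memory):
    """Soft content-based read: score each memory slot against the query,
    normalize the scores with a softmax, and return the weighted sum of slots."""
    scores = memory @ query      # one dot-product score per slot
    weights = softmax(scores)    # attention distribution over slots
    return weights @ memory, weights

# Toy usage: 6 memory slots of dimension 4.
rng = np.random.default_rng(0)
memory = rng.standard_normal((6, 4))
read, w = attention_read(rng.standard_normal(4), memory)
print(read.shape, round(w.sum(), 6))  # (4,) 1.0
```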
---
Lec23 Apr 19: Guest Lecture (Y. Dauphin, Facebook AI Research)
---
Lec24-25: Oral Presentations