The aim of this course is to cover the following concepts:
- Deep neural network architectures.
- Convolutional neural networks, recurrent neural networks.
- Attention models.
- Computational graphs and autograd (a short sketch follows this list).
- Frameworks for deep learning: PyTorch, TensorFlow, and CUDA programming.
- Statistical learning theory: generalization, the bias-variance dilemma, PAC learning, learning complexity, etc.
- Supervised learning: classification, neural networks, support vector machines, kernel methods, Gaussian processes, etc.
- Optimization.
- Unsupervised learning: clustering, matrix factorization, latent variable models (mixtures, etc.).
- Learning with low supervision: semi-supervised and transductive learning, active learning.
- Transfer learning.
- Learning with structured data: sequences and trees, graphs and interdependent data.
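
As a minimal sketch of the computational-graph and autograd topic above, here is what PyTorch (one of the frameworks listed) does when it records a forward pass and differentiates it; the tensor values and the toy squared-error loss are illustrative assumptions, not course material.

```python
import torch

# Leaf tensors; requires_grad=True asks autograd to record operations on them.
w = torch.tensor([2.0, -1.0], requires_grad=True)
x = torch.tensor([1.0, 3.0])   # input, no gradient needed

# Forward pass: each operation becomes a node of the computational graph.
y = (w * x).sum()              # y = w1*x1 + w2*x2
loss = (y - 1.0) ** 2          # toy squared-error loss

# Backward pass: autograd traverses the graph in reverse and fills w.grad.
loss.backward()
print(loss.item())             # scalar loss value
print(w.grad)                  # dloss/dw = 2*(y - 1) * x
```

Running this prints a loss of 4.0 and the gradient tensor [-4., -12.], which matches the hand-computed derivative of the recorded graph.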