DL_Topics

List of DL topics and resources essential for cracking interviews

Deep Learning Interview Topics

This repo contains a list of topics that we feel one should be comfortable with before appearing for a DL interview. The list is by no means exhaustive, as the field is very wide and ever-growing.

Mathematics

  1. Linear Algebra (Notes)
    • Linear Dependence and Span
    • Eigendecomposition
      • Eigenvalues and Eigenvectors
    • Singular Value Decomposition
  2. Probability and Statistics
    • Expectation, Variance and Covariance
    • Distributions
    • Bias and Variance
      • Bias Variance Trade-off
    • Estimators
      • Biased and Unbiased
    • Maximum Likelihood Estimation
    • Maximum A Posteriori (MAP) Estimation
  3. Information Theory
    • (Shannon) Entropy
    • Cross Entropy
    • KL Divergence
      • Not a distance metric (it is asymmetric and violates the triangle inequality)
      • Derivation from likelihood ratio (Blog)
      • Always non-negative, and zero iff the two distributions match; see the sketch after this list
        • Proof by Jensen's Inequality
      • Relation with Entropy (Explanation)
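
A minimal NumPy sketch of the relations above, using two made-up discrete distributions (the numbers are illustrative only): it checks that H(P, Q) = H(P) + D_KL(P || Q), that KL is non-negative, and that it is asymmetric (hence not a distance metric).

```python
import numpy as np

# Two made-up discrete distributions over the same support (illustration only).
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])

entropy = -np.sum(p * np.log(p))        # H(P)
cross_entropy = -np.sum(p * np.log(q))  # H(P, Q)
kl = np.sum(p * np.log(p / q))          # D_KL(P || Q)

# Relation with entropy: H(P, Q) = H(P) + D_KL(P || Q)
assert np.isclose(cross_entropy, entropy + kl)
# Non-negative (Jensen's inequality) and asymmetric, hence not a distance metric.
assert kl >= 0
assert not np.isclose(kl, np.sum(q * np.log(q / p)))
```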

Basics

  1. Backpropagation (a worked sketch follows this list)
    • Vanilla (Blog)
    • Backprop in CNNs
      • Gradients in Convolution and Deconvolution Layers
    • Backprop through time
  2. Loss Functions
    • MSE Loss
      • Derivation by MLE and MAP
    • Cross Entropy Loss
      • Binary Cross Entropy
      • Categorical Cross Entropy
  3. Activation Functions (Sigmoid, Tanh, ReLU and variants) (Blog)
  4. Optimizers
  5. Regularization
    • Early Stopping
    • Noise Injection
    • Dataset Augmentation
    • Ensembling
    • Parameter Norm Penalties
      • L1 (sparsity)
      • L2 (smaller parameter values)
    • BatchNorm (Paper) (a forward-pass sketch follows this list)
      • Internal Covariate Shift
      • BatchNorm in CNNs (Link)
      • Backprop through BatchNorm Layer (Explanation)
    • Dropout (Paper) (Notes)
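
Below, a minimal sketch of vanilla backprop for a one-hidden-layer network with sigmoid activations and MSE loss; the data, shapes, and initialization are made up for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))               # 4 made-up samples, 3 features
y = rng.normal(size=(4, 1))
W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)

# Forward pass
h = sigmoid(x @ W1 + b1)
y_hat = h @ W2 + b2
loss = np.mean((y_hat - y) ** 2)          # MSE loss

# Backward pass: chain rule, layer by layer
dy = 2 * (y_hat - y) / y.shape[0]         # dL/dy_hat
dW2, db2 = h.T @ dy, dy.sum(axis=0)
dh = dy @ W2.T                            # propagate the gradient into the hidden layer
dz = dh * h * (1 - h)                     # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
dW1, db1 = x.T @ dz, dz.sum(axis=0)
```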
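
And a sketch of the training-time BatchNorm forward pass; at inference, running averages of the batch statistics are used instead of the per-batch ones.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Training-time BatchNorm over a (batch, features) input."""
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
    return gamma * x_hat + beta             # learnable scale and shift

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(8, 4))  # made-up activations
out = batchnorm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
assert np.allclose(out.mean(axis=0), 0.0, atol=1e-6)
```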

Computer Vision

  1. ILSVRC
    • AlexNet
    • ZFNet
    • VGGNet (Notes)
    • InceptionNet (Notes)
    • ResNet (Notes)
    • DenseNet
    • SENet
  2. Object Detection (Blog)
    • RCNN (Notes)
    • Fast RCNN
    • Faster RCNN (Notes)
    • Mask RCNN
    • YOLOv3 (real-time object detection)
  3. Convolution
    • Cross-correlation
    • Pooling (Average, Max Pool)
    • Strides and Padding
    • Output volume dimension calculation (see the sketch after this list)
    • Deconvolution (Transpose Conv.), Upsampling, Reverse Pooling (Visualization)
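
A quick sketch of the output-size calculation from the list above; the example numbers correspond to a ResNet-style 7x7, stride-2, padding-3 stem on a 224x224 input.

```python
def conv_output_size(n, k, stride=1, padding=0):
    """Output spatial size of a convolution: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * padding - k) // stride + 1

assert conv_output_size(224, 7, stride=2, padding=3) == 112
```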

Natural Language Processing

  1. Recurrent Neural Networks
    • Architectures (Limitations and inspiration behind every model) (Blog 1) (Blog 2)
      • Vanilla
      • GRU
      • LSTM
      • Bidirectional
    • Vanishing and Exploding Gradients
  2. Word Embeddings
    • Word2Vec
    • CBOW
    • GloVe
    • FastText
    • Skip-gram, N-gram
    • ELMo
    • OpenAI GPT
    • BERT (Blog)
  3. Transformers (Paper) (Code) (Blog) (an attention sketch follows this list)
    • BERT (Paper)
    • Universal Sentence Encoder
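
A sketch of the scaled dot-product attention at the core of the Transformer, assuming a single unmasked head; the inputs are made up for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of the values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))  # a made-up 4-token sequence
out = scaled_dot_product_attention(Q, K, V)            # shape (4, 8)
```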

Generative Models

  1. Generative Adversarial Networks (GANs)
    • Basic Idea
    • Variants
      • Vanilla GAN (Paper)
      • DCGAN
      • Wasserstein GAN (Paper)
      • Conditional GAN (Paper)
    • Mode Collapse
    • GAN Hacks (Link)
  2. Variational Autoencoders (VAEs)
    • Variational Inference (tutorial paper)
    • ELBO and Loss Function derivation (see the sketch after this list)
  3. Normalizing Flows
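
A sketch of the resulting VAE loss (the negative ELBO), assuming a diagonal-Gaussian encoder q(z|x) = N(mu, sigma^2 I), a standard-normal prior, and a Gaussian decoder (squared-error reconstruction).

```python
import numpy as np

def vae_loss(x, x_recon, mu, log_var):
    """Negative ELBO = reconstruction term + KL(q(z|x) || N(0, I))."""
    recon = np.sum((x - x_recon) ** 2)   # Gaussian-decoder reconstruction loss
    # Closed-form KL between N(mu, sigma^2 I) and N(0, I):
    kl = -0.5 * np.sum(1 + log_var - mu ** 2 - np.exp(log_var))
    return recon + kl
```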

Misc

  1. Triplet Loss
  2. BLEU Score
  3. Maxout Networks
  4. Support Vector Machines
    • Maximal-Margin Classifier
    • Kernel Trick
  5. PCA (Explanation) (see the sketch after this list)
    • PCA using neural network
      • Architecture
      • Loss Function
  6. Spatial Transformer Networks
  7. Gaussian Mixture Models (GMMs)
  8. Expectation Maximization
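
A minimal PCA sketch via SVD of the centered data matrix, which is equivalent to eigendecomposition of its covariance matrix; the data is made up.

```python
import numpy as np

def pca(X, n_components):
    X_centered = X - X.mean(axis=0)
    # Rows of Vt are the principal directions, ordered by singular value.
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:n_components].T   # project onto the top components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, n_components=2)                    # shape (100, 2)
```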

More Resources

  1. Stanford's CS231n Lecture Notes
  2. Deep Learning Book (Goodfellow et al.)

Contributing

We welcome contributions that add resources such as notes, blogs, or papers for a topic. Feel free to open a pull request!