
D2L-Exercises

Solutions to the exercises in Dive into Deep Learning, in PyTorch

This is largely a work in progress.

If you think a solution is incorrect, please open an issue.

Status

  • 2: Preliminaries
    • 2.1: Data Manipulation
    • 2.2: Data Preprocessing
    • 2.3: Linear Algebra
    • 2.4: Calculus
    • 2.5: Automatic Differentiation
    • 2.6: Probability
  • 3: Linear Neural Networks
    • 3.1: Linear Regression
    • 3.2: Linear Regression Implementation from Scratch
    • 3.3: Concise Implementation of Linear Regression
    • 3.4: Softmax Regression
    • 3.5: The Image Classification Dataset
    • 3.6: Implementation of Softmax Regression from Scratch
    • 3.7: Concise Implementation of Softmax Regression
  • 4: Multilayer Perceptrons
    • 4.1: Multilayer Perceptrons
    • 4.2: Implementation of Multilayer Perceptrons from Scratch
    • 4.3: Concise Implementation of Multilayer Perceptrons
    • 4.4: Model Selection, Underfitting, and Overfitting
    • 4.5: Weight Decay
    • 4.6: Dropout
    • 4.7: Forward Propagation, Backward Propagation, and Computational Graphs
    • 4.8: Numerical Stability and Initialization
    • 4.9: Environment and Distribution Shift
    • 4.10: Predicting House Prices on Kaggle
  • 5: Deep Learning Computation
  • 6: Convolutional Neural Networks
  • 7: Modern Convolutional Neural Networks
  • 8: Recurrent Neural Networks
  • 9: Modern Recurrent Neural Networks
  • 10: Attention Mechanisms
  • 11: Optimization Algorithms
  • 12: Computational Performance
  • 13: Computer Vision
  • 14: Natural Language Processing: Pretraining
  • 15: Natural Language Processing: Applications
  • 16: Recommender Systems
  • 17: Generative Adversarial Networks