CS224n: Natural Language Processing with Deep Learning, Stanford / Winter 2019
Official website
Lecture 1: Introduction and Word Vectors
slides
video
official notes
Lecture 2: Word Vectors and Word Senses
slides
video
official notes
Lecture 3: Word Window Classification, Neural Networks, and Matrix Calculus
slides
video
official notes
Lecture 4: Backpropagation
slides
video
official notes
Lecture 5: Dependency Parsing
slides
video
official notes
Lecture 6: Language Models and RNNs
slides
video
official notes
Lecture 7: Vanishing Gradients, Fancy RNNs
slides
video
official notes
Lecture 8: Translation, Seq2Seq, Attention
slides
video
official notes
Lecture 9: Practical Tips for Projects
slides
video
official notes
Lecture 10: Question Answering
slides
video
official notes
Lecture 11: Convolutional Networks for NLP
slides
video
official notes
Lecture 12: Subword Models
slides
video
Lecture 13: Contextual Word Embeddings
slides
video
Lecture 14: Transformers and Self-Attention
slides
video
Lecture 15: Natural Language Generation
slides
video
Lecture 16: Coreference Resolution
slides
video
Lecture 17: Multitask Learning
slides
video
Lecture 18: Constituency Parsing, TreeRNNs
slides
video
official notes
Lecture 19: Safety, Bias, and Fairness
slides
video
Lecture 20: Future of NLP + Deep Learning
slides
video
Assignment 1
official file
my solution [Finished!]
Assignment 2
official file
handout
written part: my answer [Finished!]
coding part: my solution [Finished!]
Assignment 3
official file
handout
written part: my answer [Finished!]
coding part: my solution [Finished!]
Assignment 4
official file
handout
coding part: my solution [Finished!]