# CS224n-winter19

Solutions for CS224n, winter 2019.

Feel free to discuss problems from the assignments by opening an issue. Notes on key points from the lectures are also included. Solutions to the written parts of the assignments are written in Markdown under Assignments/written.

- Course page: https://web.stanford.edu/class/cs224n
- Video page: https://www.youtube.com/watch?v=8rXD5-xhemo&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z
## w1

### reading

- note: Word Vectors I: Introduction, SVD and Word2Vec
- Word2Vec Tutorial - The Skip-Gram Model

### practice

- coding: Assignment1
- Gensim
## w2

### reading

- note: Word Vectors II: GloVe, Evaluation and Training
- gradient-notes
- CS231n notes on backprop
- review-differential-calculus
- backprop_old
- CS231n notes on network architectures

### practice

- coding: Assignment2
- writing: Assignment2
## w3

### reading

- note: Dependency Parsing
- note: Language Models and Recurrent Neural Networks
- a3

### practice

- coding: Assignment3
- writing: Assignment3
## w4

### reading

- note: Machine Translation, Sequence-to-sequence and Attention
- a4
- read: Attention and Augmented Recurrent Neural Networks
- read: Massive Exploration of Neural Machine Translation Architectures (practical advice for hyperparameter choices)

### practice

- coding: Assignment4
- writing: Assignment4
### key points for a4

How to understand `pack_padded_sequence` and `pad_packed_sequence`?

- (Chinese ed)
- (English ed)
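In short: `pack_padded_sequence` strips the padding from a padded batch so the RNN never processes pad positions, and `pad_packed_sequence` restores the padded layout afterwards. A small round-trip demo (assumes PyTorch is installed; the toy batch is my own, not a4's data):

```python
# Round trip: padded batch -> packed -> RNN -> padded output.
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# A padded batch of 3 sequences with true lengths 4, 2, 1 (batch_first).
batch = torch.tensor([[1, 2, 3, 4],
                      [5, 6, 0, 0],
                      [7, 0, 0, 0]], dtype=torch.float32).unsqueeze(-1)
lengths = torch.tensor([4, 2, 1])

# Packing drops the pad positions, so the RNN only sees real timesteps.
packed = pack_padded_sequence(batch, lengths, batch_first=True)
rnn = torch.nn.RNN(input_size=1, hidden_size=2, batch_first=True)
packed_out, h_n = rnn(packed)

# Unpacking restores the (batch, max_len, hidden) padded layout,
# filling the positions beyond each true length with zeros.
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)             # torch.Size([3, 4, 2])
print(out_lengths.tolist())  # [4, 2, 1]
```

Note that `pack_padded_sequence` expects the batch sorted by decreasing length (unless `enforce_sorted=False`), which is why the a4 code sorts source sentences before encoding.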
## w5

It has been a long time since the last update...

### reading

- note: Machine Translation, Sequence-to-sequence and Attention
- a4
- read: Attention and Augmented Recurrent Neural Networks

### practice

- coding: Assignment5
- writing: Assignment5
## Final project

### reading

- final-project-practical-tips
- default-final-project-handout
- project-proposal-instructions
- Practical Methodology (Deep Learning book chapter)
- Highway Networks
- Bidirectional Attention Flow for Machine Comprehension

### practice

- annotate code
- train baseline
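The Highway Networks reading also underpins the character-level encoder in Assignment5 and the BiDAF baseline. A minimal sketch of a single highway layer in PyTorch (a generic version of the layer from the paper; the class and parameter names here are my own):

```python
# A single highway layer: y = g * relu(W_p x) + (1 - g) * x,
# where the gate g = sigmoid(W_g x) mixes transform and carry paths.
import torch
import torch.nn as nn

class Highway(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)   # transform path
        self.gate = nn.Linear(dim, dim)   # gating path

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(x))   # gate values in (0, 1)
        h = torch.relu(self.proj(x))      # candidate transform
        return g * h + (1.0 - g) * x      # gated skip connection

layer = Highway(dim=4)
out = layer(torch.randn(2, 4))
print(out.shape)  # torch.Size([2, 4])
```

Because the layer preserves the input dimension, several of them can be stacked without reshaping, which is how they are typically used after character-CNN embeddings.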