Solutions for CS224n, Winter 2019.
Feel free to discuss problems from the assignments by opening an issue.
Notes on the key points of the lectures are also included.
Solutions to the written parts of the assignments are written in Markdown under Assignments/written.
- Course page: https://web.stanford.edu/class/cs224n
- Video page: https://www.youtube.com/watch?v=8rXD5-xhemo&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z
- note: Word Vectors I: Introduction, SVD and Word2Vec
- Word2Vec Tutorial - The Skip-Gram Model
- coding: Assignment1
- Gensim
- note: Word Vectors II: GloVe, Evaluation and Training
- gradient-notes
- CS231n notes on backprop
- review-differential-calculus
- backprop_old
- CS231n notes on network architectures
- coding: Assignment2
- writing: Assignment2
- note: Dependency Parsing
- note: Language Models and Recurrent Neural Networks
- a3
- coding: Assignment3
- writing: Assignment3
- note: Machine Translation, Sequence-to-sequence and Attention
- a4
- read: Attention and Augmented Recurrent Neural Networks
- read: Massive Exploration of Neural Machine Translation Architectures (practical advice for hyperparameter choices)
- coding: Assignment4
- writing: Assignment4
How to understand pack_padded_sequence and pad_packed_sequence? See the sketch below.
- (Chinese edition)
- (English edition)
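
A minimal sketch using PyTorch's torch.nn.utils.rnn API (the toy batch, vocabulary size, and hidden size are made up for illustration, not taken from the assignment starter code). It shows how pack_padded_sequence lets an LSTM skip padded time steps and how pad_packed_sequence restores the regular padded layout afterwards:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Toy batch: 3 sequences of lengths 4, 2, 1, right-padded with 0 to length 4.
batch = torch.tensor([
    [1, 2, 3, 4],
    [5, 6, 0, 0],
    [7, 0, 0, 0],
])
lengths = torch.tensor([4, 2, 1])  # true lengths, sorted in decreasing order

embed = torch.nn.Embedding(num_embeddings=10, embedding_dim=8, padding_idx=0)
lstm = torch.nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

x = embed(batch)  # (batch=3, seq=4, emb=8)

# Pack: flattens the batch so the LSTM only processes the real (non-pad) steps.
packed = pack_padded_sequence(x, lengths, batch_first=True, enforce_sorted=True)
packed_out, (h_n, c_n) = lstm(packed)

# Unpack: restores a regular (batch, seq, hidden) tensor, re-padding with zeros.
out, out_lens = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)   # torch.Size([3, 4, 16])
print(out_lens)    # tensor([4, 2, 1])
```

The point is that the LSTM never runs over padded positions, so the final hidden states h_n and c_n come from each sequence's last real token rather than from padding.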
It has been a long time since the last update...
- note: Machine Translation, Sequence-to-sequence and Attention
- a5
- read: Attention and Augmented Recurrent Neural Networks
- coding: Assignment5
- writing: Assignment5
reading:
- final-project-practical-tips
- default-final-project-handout
- project-proposal-instructions
- Practical Methodology (Deep Learning book chapter)
- Highway Networks
- Bidirectional Attention Flow for Machine Comprehension
practice:
- annotate the code
- train the baseline