CS224n-winter19

Solutions for CS224n, Winter 2019.
Feel free to discuss problems from the assignments by opening an issue.
Notes on the key points of the lectures are also included. The assignment solutions are written in Markdown in the written part of Assignments.


Update

update 2019/12/03

  After CS224n I realized that I need more systematic training, so I started a new repo, learn_NLP_again. Here is its description (algorithms and solutions are available for chapter 1 so far):

  Here is why I started this project: to learn NLP from scratch again. I chose Speech and Language Processing as my entry point, and I try to write solutions and implement some of the algorithms/models from this book. I hope I can stick to this project and update it frequently.

  After a year of training in industry and the lab, I have found many faults and bad habits in my past practice (by the way, there are too many commits in this repo). I'll review the code in this repo and resolve the issues gradually (:smile:, hopefully).

Discussions are welcome in the new repo!

w1

reading

  • note: Word Vectors I: Introduction, SVD and Word2Vec
  • Word2Vec Tutorial - The Skip-Gram Model  

practice

  • coding: Assignment1
  • Gensim
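
A minimal sketch of the kind of Gensim exploration that goes with Assignment 1; the pretrained model name "glove-wiki-gigaword-100" below is one of Gensim's downloadable vector sets and is just an assumption here, not necessarily what the assignment uses:

```python
# Sketch: exploring pretrained word vectors with Gensim.
# Assumption: gensim's downloader module and the "glove-wiki-gigaword-100" model.
import gensim.downloader as api

wv = api.load("glove-wiki-gigaword-100")   # KeyedVectors with 100-d GloVe vectors

print(wv.most_similar("king", topn=5))     # nearest neighbors by cosine similarity
print(wv.similarity("coffee", "tea"))      # cosine similarity between two words
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))  # analogy query
```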

w2

reading

  • note: Word Vectors II: GloVe, Evaluation and Training
  • gradient-notes
  • CS231n notes on backprop
  • review-differential-calculus
  • backprop_old
  • CS231n notes on network architectures

practice

  • coding: Assignment2 (see the gradient-check sketch after this list)
  • writing: Assignment2
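
The gradient and backprop notes above pair naturally with a numerical gradient check when debugging the Assignment 2 word2vec code. A minimal sketch, with illustrative function names rather than the assignment's actual utilities:

```python
import numpy as np

def gradcheck(f, x, analytic_grad, h=1e-4, tol=1e-5):
    """Compare an analytic gradient against central finite differences.

    f             -- function mapping a numpy array x to a scalar loss
    analytic_grad -- gradient of f at x computed analytically (e.g. by backprop)
    """
    for ix in np.ndindex(*x.shape):
        old = x[ix]
        x[ix] = old + h
        fxph = f(x)                       # f with x[ix] nudged up
        x[ix] = old - h
        fxmh = f(x)                       # f with x[ix] nudged down
        x[ix] = old                       # restore the original value
        numeric = (fxph - fxmh) / (2 * h)
        rel_err = abs(numeric - analytic_grad[ix]) / max(1e-8, abs(numeric) + abs(analytic_grad[ix]))
        assert rel_err < tol, f"gradient mismatch at {ix}: {numeric} vs {analytic_grad[ix]}"

# Sanity check on f(x) = sum(x ** 2), whose gradient is 2x.
x = np.random.randn(3, 4)
gradcheck(lambda v: np.sum(v ** 2), x, 2 * x)
print("gradient check passed")
```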

w3

reading

  • note: Dependency Parsing
  • note: Language Models and Recurrent Neural Networks
  • a3

practice

  • coding: Assignment3 (see the transition sketch after this list)
  • writing: Assignment3
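
Assignment 3 builds a transition-based dependency parser. A minimal sketch of the SHIFT / LEFT-ARC / RIGHT-ARC transitions it revolves around; the class and transition codes here are illustrative, not the starter code's exact API:

```python
# Illustrative sketch of the transitions behind a transition-based dependency parser.
class PartialParse:
    def __init__(self, sentence):
        self.sentence = sentence
        self.stack = ["ROOT"]          # words currently being processed
        self.buffer = list(sentence)   # words not yet shifted
        self.dependencies = []         # (head, dependent) arcs produced so far

    def parse_step(self, transition):
        if transition == "S":          # SHIFT: move the next buffer word onto the stack
            self.stack.append(self.buffer.pop(0))
        elif transition == "LA":       # LEFT-ARC: second-from-top depends on top
            dependent = self.stack.pop(-2)
            self.dependencies.append((self.stack[-1], dependent))
        elif transition == "RA":       # RIGHT-ARC: top depends on second-from-top
            dependent = self.stack.pop(-1)
            self.dependencies.append((self.stack[-1], dependent))

# Example: parse "I ate fish" with a fixed transition sequence.
pp = PartialParse(["I", "ate", "fish"])
for t in ["S", "S", "LA", "S", "RA", "RA"]:
    pp.parse_step(t)
print(pp.dependencies)  # [('ate', 'I'), ('ate', 'fish'), ('ROOT', 'ate')]
```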

w4

reading

  • note: Machine Translation, Sequence-to-sequence and Attention
  • a4
  • read: Attention and Augmented Recurrent Neural Networks
  • read: Massive Exploration of Neural Machine Translation Architectures (practical advice for hyperparameter choices)

practice

  • coding: Assignment4
  • writing: Assignment4

key point for a4

How to understand pack_padded_sequence and pad_packed_sequence?
(Chinese edition)
(English edition)
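
Both functions live in PyTorch's torch.nn.utils.rnn module. A minimal, self-contained illustration; the toy batch size, sequence lengths, and LSTM dimensions below are made up for the example:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Toy batch: 3 sequences of embeddings, padded to the max length 4.
# Shapes follow batch_first=True: (batch, max_len, emb_dim).
batch = torch.randn(3, 4, 8)
lengths = torch.tensor([4, 3, 1])   # true (unpadded) lengths, sorted descending

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

# pack_padded_sequence reorders the batch so the LSTM skips the padding
# positions entirely instead of running over them.
packed = pack_padded_sequence(batch, lengths, batch_first=True)
packed_out, (h_n, c_n) = lstm(packed)

# pad_packed_sequence restores a regular padded tensor, plus the lengths.
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)     # torch.Size([3, 4, 16]); positions beyond each length are zeros
print(out_lengths)   # tensor([4, 3, 1])
```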

w5

It has been a long time since the last update...

reading

  • note: Machine Translation, Sequence-to-sequence and Attention
  • a4
  • read: Attention and Augmented Recurrent Neural Networks

practice

  • coding: Assignment5
  • writing: Assignment5

Final project

reading:

  • final-project-practical-tips
  • default-final-project-handout
  • project-proposal-instructions
  • Practical Methodology (Deep Learning book chapter)
  • Highway Networks (see the sketch after this list)
  • Bidirectional Attention Flow for Machine Comprehension
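
Since the Highway Networks paper is on the list above, here is a minimal sketch of a single highway layer in PyTorch, following the paper's gating y = T(x) * H(x) + (1 - T(x)) * x; the class name and dimensions are illustrative, not the project starter code's implementation:

```python
import torch
import torch.nn as nn

class Highway(nn.Module):
    """One highway layer: y = t * H(x) + (1 - t) * x, with t = sigmoid(W_t x + b_t)."""
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)   # H(x): candidate transform
        self.gate = nn.Linear(dim, dim)   # T(x): how much of H(x) to keep

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.proj(x))
        t = torch.sigmoid(self.gate(x))
        return t * h + (1 - t) * x        # gated mix of transform and identity

# Example: a batch of 32 vectors of size 128 passes through with unchanged shape.
layer = Highway(128)
print(layer(torch.randn(32, 128)).shape)  # torch.Size([32, 128])
```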

practice:

  • annotate the code
  • train the baseline