
Stanford CS224n: Natural Language Processing with Deep Learning -- Spring 2019

Deep Learning-Based NLP

Course website: https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/

Syllabus: https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/index.html#schedule

Project: Question Answering Using the SQuAD 2.0 Dataset

Project handout: https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/project/default-final-project-handout.pdf

Project code: https://github.com/amitp-ai/CS224n_Stanford_NLP/tree/main/Project_QA_SQuAD2

Experimented with:

  1. Bi-directional Attention Flow (BiDAF), with and without character-level embeddings (in addition to word-level embeddings)
  2. BiDAF with soft output labelling for lower variance
  3. Question-Answer Network (QANet)
  4. QANet with stochastic depth dropout inside the contextual embedding layer (see the sketch after this list)
  5. Bidirectional Encoder Representations from Transformers (BERT) based architecture for question-answering tasks
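
To make item 4 concrete, here is a minimal PyTorch sketch of stochastic depth applied to a residual sub-layer: during training the sub-layer is skipped with some probability and the surviving branch is rescaled. The module name, drop probability, and shapes are illustrative assumptions, not the project's actual implementation.

```python
import torch
import torch.nn as nn


class StochasticResidual(nn.Module):
    """Wrap a sub-layer f as x + f(x), but skip f entirely with
    probability p_drop during training (stochastic depth)."""

    def __init__(self, sublayer: nn.Module, p_drop: float = 0.1):
        super().__init__()
        self.sublayer = sublayer
        self.p_drop = p_drop

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x + self.sublayer(x)
        if torch.rand(1).item() < self.p_drop:
            return x  # the whole sub-layer is dropped for this batch
        # Scale the surviving branch so its expectation matches inference.
        return x + self.sublayer(x) / (1.0 - self.p_drop)


# Illustrative usage on a (batch, seq_len, hidden) tensor.
block = StochasticResidual(nn.Sequential(nn.LayerNorm(128), nn.Linear(128, 128), nn.ReLU()), p_drop=0.1)
out = block(torch.randn(32, 20, 128))
```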

Assignments

Assignment 1: Exploring Word Vectors

Assignment 2: Understanding and Implementing Word2Vec
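
As a pointer to what this assignment builds toward, below is a minimal NumPy sketch of the skip-gram negative-sampling loss for a single (center word, outside word) pair with K sampled negatives. Variable names are illustrative and do not mirror the assignment's starter code.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def neg_sampling_loss(v_c, u_o, U_neg):
    """Skip-gram negative-sampling loss for one (center, outside) pair.

    v_c:   (d,)   center word vector
    u_o:   (d,)   true outside word vector
    U_neg: (K, d) vectors of K sampled negative words
    """
    pos = -np.log(sigmoid(u_o @ v_c))             # pull the true pair together
    neg = -np.sum(np.log(sigmoid(-U_neg @ v_c)))  # push negative samples away
    return pos + neg


# Illustrative usage with random 50-dimensional vectors.
rng = np.random.default_rng(0)
print(neg_sampling_loss(rng.normal(size=50), rng.normal(size=50), rng.normal(size=(10, 50))))
```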

Assignment 3: Neural Dependency Parser
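
The transition system behind a transition-based neural dependency parser fits in a few lines. The sketch below is a generic SHIFT / LEFT-ARC / RIGHT-ARC illustration with made-up names, not the assignment's starter code.

```python
# Generic shift-reduce transitions for dependency parsing; names are illustrative.
class PartialParse:
    def __init__(self, sentence):
        self.stack = ["ROOT"]          # partially processed words
        self.buffer = list(sentence)   # words not yet processed
        self.dependencies = []         # (head, dependent) arcs found so far

    def apply(self, transition):
        if transition == "S":     # SHIFT: move the next buffer word onto the stack
            self.stack.append(self.buffer.pop(0))
        elif transition == "LA":  # LEFT-ARC: top of stack becomes head of the word below it
            dependent = self.stack.pop(-2)
            self.dependencies.append((self.stack[-1], dependent))
        elif transition == "RA":  # RIGHT-ARC: word below the top becomes head of the top
            dependent = self.stack.pop()
            self.dependencies.append((self.stack[-1], dependent))


parse = PartialParse(["I", "ate", "fish"])
for t in ["S", "S", "LA", "S", "RA", "RA"]:
    parse.apply(t)
print(parse.dependencies)  # [('ate', 'I'), ('ate', 'fish'), ('ROOT', 'ate')]
```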

Assignment 4: Neural Machine Translation

Assignment 5: More Advanced Neural Machine Translation
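
As a reminder of the attention step at the core of these NMT models (Assignments 4 and 5), here is a generic dot-product attention sketch in PyTorch. The function name and tensor shapes are illustrative assumptions; the assignments' actual architectures differ in their details.

```python
import torch
import torch.nn.functional as F


def attention_step(dec_hidden, enc_hiddens, enc_pad_mask):
    """One dot-product attention step for a seq2seq decoder.

    dec_hidden:   (batch, h)          current decoder state
    enc_hiddens:  (batch, src_len, h) encoder states
    enc_pad_mask: (batch, src_len)    True at padded source positions
    returns:      (batch, h)          attention-weighted context vector
    """
    scores = torch.bmm(enc_hiddens, dec_hidden.unsqueeze(2)).squeeze(2)  # (batch, src_len)
    scores = scores.masked_fill(enc_pad_mask, float("-inf"))             # ignore padding
    alpha = F.softmax(scores, dim=1)                                     # attention distribution
    return torch.bmm(alpha.unsqueeze(1), enc_hiddens).squeeze(1)         # (batch, h)


# Illustrative usage: batch of 4, source length 10, hidden size 256, no padding.
ctx = attention_step(torch.randn(4, 256), torch.randn(4, 10, 256), torch.zeros(4, 10, dtype=torch.bool))
```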