awesome-nlp

:book: A curated list of resources dedicated to Natural Language Processing

Maintainers - Keon Kim, Martin Park

Contributing

Please feel free to send pull requests, or email Martin Park (sp3005@nyu.edu) / Keon Kim (keon.kim@nyu.edu), to add links.

Table of Contents

Tutorials and Courses

  • TensorFlow Tutorial on Seq2Seq Models
  • Natural Language Understanding with Distributed Representation Lecture Note by Cho

Videos

Deep Learning for NLP

Stanford Natural Language Processing
Intro NLP course with videos. It does not cover deep learning, but it is a good primer for traditional NLP.

Stanford CS 224D: Deep Learning for NLP class
Richard Socher. (2015). Class with videos and slides.

A Primer on Neural Network Models for Natural Language Processing
Yoav Goldberg. October 2015. No new material; a 75-page summary of the state of the art.

Code

Implementations

Libraries

Services or APIs

  • Wit-ai - Natural Language Interface for apps and devices.

Articles

Review Articles

Word Vectors (partly from DL4NLP)

Resources about word vectors, aka word embeddings, and distributed representations for words.
Word vectors are numeric representations of words that are often used as input to deep learning systems; training them in advance on large amounts of text is sometimes called pretraining.

Efficient Estimation of Word Representations in Vector Space
[Distributed Representations of Words and Phrases and their Compositionality](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf)
Mikolov et al. 2013.
Generate word and phrase vectors. Performs well on word similarity and analogy tasks, and includes the Word2Vec source code. Subsamples frequent words (i.e. frequent words like "the" are skipped periodically to speed things up and to improve the vectors for less frequent words).
Word2Vec tutorial in TensorFlow
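
To make the idea concrete, here is a minimal sketch of training and querying word vectors with gensim's Word2Vec; the toy corpus and hyperparameter values (including the `sample` subsampling rate mentioned above) are illustrative assumptions, not settings from the paper.

```python
# Minimal sketch: training and querying word vectors with gensim's Word2Vec.
# The toy corpus and hyperparameter values below are illustrative assumptions.
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the word vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    sample=1e-3,      # subsample very frequent words (e.g. "the")
)

# Each word is now a dense numeric vector that can feed a downstream model.
vec = model.wv["king"]
print(vec.shape)                      # (50,)
print(model.wv.most_similar("king"))  # nearest neighbours in vector space
```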

Deep Learning, NLP, and Representations
Chris Olah (2014) Blog post explaining word2vec.

GloVe: Global vectors for word representation
Pennington, Socher, Manning. 2014. Creates word vectors and relates word2vec to matrix factorizations. The evaluation section drew criticism from Yoav Goldberg.
Glove source code and training data
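
The released GloVe vectors ship as plain-text files with one word and its vector per line, so a rough loading sketch needs no special library; the filename below is an assumption about which download you are using.

```python
# Rough sketch: loading pretrained GloVe vectors from the plain-text release.
# The filename below is an assumption; use whichever GloVe file you downloaded.
import numpy as np

def load_glove(path):
    """Each line is: word followed by its vector components, space-separated."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, values = parts[0], parts[1:]
            vectors[word] = np.asarray(values, dtype=np.float32)
    return vectors

glove = load_glove("glove.6B.50d.txt")
print(glove["king"].shape)  # e.g. (50,)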

Thought Vectors (from DL4NLP)

Thought vectors are numeric representations for sentences, paragraphs, and documents. The following papers are listed in order of publication date; each one supersedes the last as the state of the art in sentiment analysis.

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
Socher et al. 2013. Introduces Recursive Neural Tensor Network. Uses a parse tree.

Distributed Representations of Sentences and Documents
Le, Mikolov. 2014. Introduces Paragraph Vector. Concatenates and averages pretrained, fixed word vectors to create vectors for sentences, paragraphs and documents. Also known as paragraph2vec. Doesn't use a parse tree.
Implemented in gensim. See doc2vec tutorial
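
Since paragraph2vec is implemented in gensim, a minimal usage sketch looks roughly like the following; the toy documents and hyperparameters are placeholder assumptions.

```python
# Minimal sketch: paragraph vectors (doc2vec) with gensim.
# The toy documents and hyperparameters are illustrative assumptions.
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

docs = [
    TaggedDocument(words=["the", "movie", "was", "great"], tags=["doc0"]),
    TaggedDocument(words=["the", "film", "was", "terrible"], tags=["doc1"]),
]

model = Doc2Vec(docs, vector_size=50, min_count=1, epochs=40)

# A fixed-length vector for each training document ...
print(model.dv["doc0"].shape)           # (50,)

# ... and for unseen text via inference.
new_vec = model.infer_vector(["a", "great", "movie"])
print(new_vec.shape)                    # (50,)
```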

Deep Recursive Neural Networks for Compositionality in Language
Irsoy & Cardie. 2014. Uses Deep Recursive Neural Networks. Uses a parse tree.

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
Tai et al. 2015. Introduces Tree-LSTM. Uses a parse tree.

Semi-supervised Sequence Learning
Dai, Le. 2015. "With pretraining, we are able to train long short term memory recurrent networks up to a few hundred timesteps, thereby achieving strong performance in many text classification tasks, such as IMDB, DBpedia and 20 Newsgroups."

Machine Translation

Neural Machine Translation by Jointly Learning to Align and Translate
Bahdanau, Cho 2014. "comparable to the existing state-of-the-art phrase-based system on the task of English-to-French translation." Implements an attention mechanism.
English to French Demo

Sequence to Sequence Learning with Neural Networks
Sutskever, Vinyals, Le 2014. (NIPS presentation). Uses LSTM RNNs to generate translations. "Our main result is that on an English to French translation task from the WMT’14 dataset, the translations produced by the LSTM achieve a BLEU score of 34.8."
seq2seq tutorial in TensorFlow
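
To make the encoder-decoder idea behind these papers concrete, here is a toy PyTorch sketch (not the authors' code): one LSTM encodes the source sentence into a state, and a second LSTM decodes the target from that state. All sizes, the vocabularies and the teacher-forcing setup are assumptions.

```python
# Toy sketch of the seq2seq idea: an LSTM encodes the source sentence into a
# fixed state, and a second LSTM decodes the translation from that state.
# Vocabulary sizes, dimensions and the teacher-forcing setup are assumptions.
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1000, 64, 128

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, EMB)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, EMB)
        self.encoder = nn.LSTM(EMB, HID, batch_first=True)
        self.decoder = nn.LSTM(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, src, tgt):
        # Encode the whole source sentence; keep only the final (h, c) state.
        _, state = self.encoder(self.src_emb(src))
        # Decode the target conditioned on that state (teacher forcing).
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)  # per-step scores over the target vocabulary

model = Seq2Seq()
src = torch.randint(0, SRC_VOCAB, (2, 7))   # batch of 2 source sentences
tgt = torch.randint(0, TGT_VOCAB, (2, 9))   # batch of 2 target sentences
print(model(src, tgt).shape)                # torch.Size([2, 9, 1000])
```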

Single Exchange Dialogs (from DL4NLP)

A Neural Network Approach to Context-Sensitive Generation of Conversational Responses
Sordoni 2015. Generates responses to tweets.
Uses the Recurrent Neural Network Language Model (RLM) architecture of (Mikolov et al., 2010). Source code: RNNLM Toolkit.

Neural Responding Machine for Short-Text Conversation
Shang et al. 2015. Uses Neural Responding Machine. Trained on a Weibo dataset. Achieves one-round conversations with 75% appropriate responses.

A Neural Conversation Model
Vinyals, Le 2015. Uses LSTM RNNs to generate conversational responses. Uses the seq2seq framework. Seq2Seq was originally designed for machine translation; it "translates" a single sentence of up to around 79 words into a single-sentence response and has no memory of previous dialog exchanges. Used in Google's Smart Reply feature for Inbox.

Memory and Attention Models (from DL4NLP)

Reasoning, Attention and Memory (RAM) workshop at NIPS 2015. Slides included.

Memory Networks Weston et al. 2014, and End-To-End Memory Networks Sukhbaatar et al. 2015.
Memory networks are implemented in MemNN. They attempt to solve the tasks of reasoning, attention and memory.
Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks
Weston 2015. Classifies QA tasks like single factoid, yes/no etc. Extends memory networks.
Evaluating prerequisite qualities for learning end to end dialog systems
Dodge et al. 2015. Tests Memory Networks on four tasks, including a Reddit dialog task.
See Jason Weston lecture on MemNN
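
The core step shared by these memory and attention models can be sketched in a few lines: a query vector scores each memory slot, a softmax turns the scores into attention weights, and the model reads out a weighted combination. A toy numpy illustration (random vectors and a single memory hop are simplifying assumptions, not the papers' full models):

```python
# Toy numpy sketch of the core memory-network step: the query attends over
# memory slots with a softmax and reads out a weighted answer vector.
# The random "embeddings" and single memory hop are simplifying assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, n_memories = 20, 5

memories = rng.normal(size=(n_memories, d))  # embedded supporting sentences
query = rng.normal(size=d)                   # embedded question

scores = memories @ query                          # match each memory to the query
weights = np.exp(scores) / np.exp(scores).sum()    # softmax attention weights
output = weights @ memories                        # attention-weighted readout

print(weights.round(3))   # how much each memory contributes
print(output.shape)       # (20,) vector fed to the answer layer
```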

Neural Turing Machines
Graves et al. 2014.

Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets
Joulin, Mikolov 2015. Stack RNN source code and blog post

General Natural Language Processing

Named Entity Recognition

Neural Network

Supplementary Materials

Blogs

Credits

Parts of the lists are from