
# Deep Learning for NLP resources

Introductory and state-of-the-art resources for NLP sequence modeling tasks such as dialog.

## Machine Learning: Neural Networks, RNN, LSTM

Coursera: Machine Learning
Andrew Ng
Introductory course for linear regression, logistic regression, and neural networks.
Also covers support vector machines, k-means, etc.

Coursera: Neural Networks
Geoffrey Hinton
Covers a variety of topics: Neural nets, RNNs, LSTMs.

Deep Learning (Book)
Goodfellow, Bengio, and Courville
An advanced textbook on deep learning.

A few useful things to know about machine learning
Pedro Domingos

Understanding LSTM Networks
Blog post by Chris Olah.

## Deep Learning for NLP

Stanford CS 224D: Deep Learning for NLP class
Richard Socher. (2015). Class with videos and slides.

A Primer on Neural Network Models for Natural Language Processing
Yoav Goldberg. October 2015. No new material; a 75-page summary of the state of the art.

Natural Language Processing (Almost) from Scratch
Collobert et al. 2011. Uses neural nets for POS tagging, chunking, and NER.

## Word Vectors

Resources about word vectors, also known as word embeddings or distributed representations of words.
Word vectors are numeric representations of words that are often used as input to deep learning systems; training them in advance on unlabeled text is sometimes called pretraining.
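
For a concrete sense of how word vectors are produced and queried, here is a minimal sketch using gensim's Word2Vec (gensim 4.x API; the toy corpus and hyperparameters are illustrative, not taken from the papers below):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["deep", "learning", "for", "nlp"],
    ["word", "vectors", "capture", "word", "similarity"],
]

# Train 100-dimensional skip-gram vectors (sg=1); settings are illustrative.
model = Word2Vec(sentences, vector_size=100, sg=1, min_count=1)

vec = model.wv["learning"]           # a word's numeric representation
print(model.wv.most_similar("nlp"))  # nearest neighbors in vector space
```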

Efficient Estimation of Word Representations in Vector Space
[Distributed Representations of Words and Phrases and their Compositionality](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf)
Mikolov et al. 2013.
Generates word and phrase vectors. Performs well on word similarity and analogy tasks, and includes the word2vec source code. Subsamples frequent words (i.e., frequent words like "the" are skipped periodically to speed up training and to improve the vectors of less frequent words).
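
A minimal sketch of that subsampling heuristic, using the discard probability given in the paper (the released word2vec code uses a slightly different formula):

```python
import math
import random

T = 1e-5  # subsampling threshold suggested in the paper

def keep_probability(word_frequency, t=T):
    # Mikolov et al. (2013) discard each occurrence of word w with
    # probability 1 - sqrt(t / f(w)), where f(w) is its relative corpus
    # frequency, i.e. keep it with probability sqrt(t / f(w)).
    return min(1.0, math.sqrt(t / word_frequency))

def keep(word_frequency):
    return random.random() < keep_probability(word_frequency)

print(keep_probability(0.05))  # very frequent word ("the"): kept ~1.4% of the time
print(keep_probability(1e-6))  # rare word: always kept (1.0)
```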

Deep Learning, NLP, and Representations
Chris Olah. (2014). Blog post explaining word2vec.

GloVe: Global vectors for word representation
Pennington, Socher, Manning. 2014. Creates word vectors and relates word2vec to matrix factorization. The evaluation section drew criticism from Yoav Goldberg.
GloVe source code and training data
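
For reference, GloVe fits word vectors to the logarithm of the word co-occurrence counts X_ij with a weighted least-squares objective (from the paper, which uses x_max = 100 and alpha = 3/4):

```latex
J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
\qquad
f(x) =
\begin{cases}
  (x / x_{\max})^{\alpha} & \text{if } x < x_{\max} \\
  1 & \text{otherwise}
\end{cases}
```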

## Thought Vectors

Thought vectors are numeric representations for sentences, paragraphs, and documents. The following papers are listed in order of publication; each one supersedes the previous as the state of the art in sentiment analysis. Paragraph vectors are unique in that they don't require a parse tree.

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
Socher et al. 2013. Introduces Recursive Neural Tensor Network. Uses a parse tree.

Distributed Representations of Sentences and Documents
Le, Mikolov. 2014. Introduces Paragraph Vector, also known as paragraph2vec. Concatenates and averages pretrained, fixed word vectors to create vectors for sentences, paragraphs, and documents. Doesn't use a parse tree.
Implemented in gensim; see the doc2vec tutorial and the usage sketch below.
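
A minimal usage sketch with gensim's Doc2Vec (gensim 4.x API; the documents, tags, and hyperparameters are illustrative):

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Toy corpus: each document gets a tag naming its paragraph vector.
docs = [
    TaggedDocument(words=["deep", "learning", "for", "nlp"], tags=["doc0"]),
    TaggedDocument(words=["thought", "vectors", "for", "documents"], tags=["doc1"]),
]

model = Doc2Vec(docs, vector_size=50, min_count=1, epochs=40)

vec = model.dv["doc0"]                                  # trained paragraph vector
new = model.infer_vector(["an", "unseen", "document"])  # vector for new text
```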

Deep Recursive Neural Networks for Compositionality in Language
Irsoy & Cardie. 2014. Uses Deep Recursive Neural Networks. Uses a parse tree.

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
Tai et al. 2015. Introduces the Tree-LSTM. Uses a parse tree.

## Dialog

A Neural Network Approach to Context-Sensitive Generation of Conversational Responses
Sordoni et al. 2015. Generates responses to tweets. Uses the Recurrent Neural Network Language Model (RLM) architecture of Mikolov et al. (2010).

Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks
Weston et al. 2015. Classifies QA tasks. Expands on Memory Networks.

A Neural Conversation Model
Vinyals, Le. 2015. Uses LSTM RNNs to generate conversational responses in the seq2seq framework.
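
As a rough illustration of the encoder-decoder idea behind this model (not the paper's exact architecture; the class name and sizes below are made up), a minimal PyTorch sketch:

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Minimal encoder-decoder in the spirit of seq2seq (hypothetical sketch)."""
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src, tgt):
        # Encode the input utterance; keep only the final (h, c) state.
        _, state = self.encoder(self.embed(src))
        # Decode the response conditioned on the encoder's final state.
        dec_out, _ = self.decoder(self.embed(tgt), state)
        return self.out(dec_out)  # per-step vocabulary logits

model = Seq2Seq(vocab_size=10000)
src = torch.randint(0, 10000, (2, 7))  # batch of 2 input utterances
tgt = torch.randint(0, 10000, (2, 5))  # response tokens (teacher forcing)
logits = model(src, tgt)               # shape (2, 5, 10000)
```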

## Advanced Memory Architectures

Neural Turing Machines
Graves et al. 2014.

Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets
Joulin, Mikolov. 2015. Stack RNN source code.