List of resources to get started with Deep Learning for NLP. (Updated incrementally)
-
https://www.youtube.com/playlist?list=PL6Xpj9I5qXYEcOhn7TqghAJ6NAPrNmUBH : This lecture series is a very good introduction to Neural Networks and Deep Learning.
-
https://www.coursera.org/course/neuralnets : This lecture series is from Geoff Hinton. The concepts are explained somewhat abstractly and are hard to grasp on a first pass. People generally recommend these lectures as a starting point, but I am skeptical about that; I would suggest going through the first resource above before this one.
-
https://www.youtube.com/playlist?list=PLE6Wd9FR--EfW8dtjAuPoTuPcqmOV53Fu : Deep Learning Lectures from Oxford University
-
https://www.iro.umontreal.ca/~lisa/pointeurs/TR1312.pdf : This is a short book on Deep Learning written by Yoshua Bengio. It deals with the theoretical aspects of deep architectures. A great book nonetheless.
-
http://www.deeplearningbook.org/ : This web page has a draft of the book written by Yoshua Bengio and Ian Goodfellow; the latter is also one of the authors of the Theano library. This is the holy bible of Deep Learning.
-
http://cs231n.stanford.edu/ : Deep Learning for Vision from Stanford. The initial lectures by Andrej Karpathy are a good introduction to DL.
-
http://videolectures.net/yoshua_bengio/ : Video lectures by Yoshua Bengio on the theoretical aspects of Deep Learning. They are the video counterparts of Bengio's book listed above.
-
http://videolectures.net/geoffrey_e_hinton/ : Video lectures by the godfather, Geoffrey Hinton, covering an introduction to Deep Learning as well as some advanced material.
-
https://github.com/ChristosChristofidis/awesome-deep-learning : Good collection of resources.
-
http://deeplearning.net/reading-list/ : Reading resources
-
http://www.cs.toronto.edu/~hinton/csc2515/deeprefs.html : Reading list by Hinton
-
http://videolectures.net/mlss05us_lecun_ebmli/ : Intro to Energy-Based Models by Yann LeCun.
-
http://videolectures.net/kdd2014_bengio_deep_learning/?q=ICLR# : Yoshua Bengio's lecture series recorded at KDD '14.
-
http://videolectures.net/nips09_collobert_weston_dlnl/ : Ronan Collobert and Jason Weston's lecture (it's quite old now, from NIPS '09, but I think it is still useful).
-
https://www.youtube.com/watch?v=eixGKz0Asr8 : Lecture series by Chris Manning and Richard Socher given at NAACL 2013
-
https://www.youtube.com/watch?v=AmG4jzmBZ88 : Lecture series for DL4NLP with some practical guidelines.
-
https://blog.wtf.sg/2014/08/24/nlp-with-neural-networks/ : Blog post on some DL applications for NLP.
-
http://lamda.nju.edu.cn/weixs/project/CNNTricks/CNNTricks.html : Some useful tricks for training Neural Networks
-
http://cs224d.stanford.edu/lectures/CS224d-Lecture11.pdf : Short notes on backprop and word embeddings
-
http://cilvr.nyu.edu/doku.php?id=courses:deeplearning2014:start : A course on Deep Learning by Yann LeCun, taught at NYU.
-
http://cs224d.stanford.edu/ : A course specifically designed for Deep Learning for NLP (Stanford CS224d).
-
https://devblogs.nvidia.com/parallelforall/understanding-natural-language-deep-neural-networks-using-torch/#.VPYhS2vB09E.reddit : NLP using Torch
-
http://www.kyunghyuncho.me/home/courses/ds-ga-3001-fall-2015 : Natural Language Understanding with Distributed Representations
-
http://mlwave.com/kaggle-ensembling-guide/ : Ensembling guide. Very useful for designing practical ML systems; a minimal sketch of the simplest trick from it follows below.
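The simplest technique covered in the guide is averaging the predicted class probabilities of several models. A minimal sketch with made-up numbers and illustrative model names (not code from the guide itself):

```python
import numpy as np

# Each row: one model's predicted probability of the positive class per example.
# The models named in the comments are only illustrative.
model_probs = np.array([
    [0.9, 0.2, 0.6],   # e.g. logistic regression
    [0.8, 0.4, 0.7],   # e.g. gradient-boosted trees
    [0.7, 0.1, 0.8],   # e.g. a small neural network
])

avg_probs = model_probs.mean(axis=0)        # simple, unweighted averaging
labels = (avg_probs > 0.5).astype(int)      # final ensemble decision
print(avg_probs, labels)
```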
-
http://joanbruna.github.io/stat212b/ : Topics course in Deep Learning by Joan Bruna, UC Berkeley Statistics Department.
-
https://medium.com/@memoakten/selection-of-resources-to-learn-artificial-intelligence-machine-learning-statistical-inference-23bc56ba655#.s5kjy7bgo : List of Deep Learning talks and resources.
-
https://www.tensorflow.org/versions/r0.7/tutorials/word2vec/index.html : TensorFlow tutorial on word2vec.
-
http://textminingonline.com/getting-started-with-word2vec-and-glove : Intro to word2vec and GloVe.
-
http://rare-technologies.com/deep-learning-with-word2vec-and-gensim/ : Getting started with word2vec and gensim.
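For orientation, a minimal sketch of training word2vec with gensim, assuming the gensim 4.x API (where the older size/iter arguments became vector_size/epochs); the toy sentences are made up:

```python
from gensim.models import Word2Vec

# In practice this would be an iterable over tokenized sentences from a large corpus.
sentences = [
    ["deep", "learning", "for", "nlp"],
    ["word", "embeddings", "capture", "distributional", "semantics"],
]

model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1, epochs=5)
vec = model.wv["nlp"]                    # 100-dimensional vector for a word
similar = model.wv.most_similar("nlp")   # nearest neighbours by cosine similarity
```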
-
http://www.lab41.org/anything2vec/ : Great explanation of word2vec and its relation to neural networks.
-
http://www.offconvex.org/2015/12/12/word-embeddings-1/ : Intuition on word embedding methods
-
http://www.offconvex.org/2016/02/14/word-embeddings-2/ : Explains the math behind word2vec and GloVe (also contains links to other good articles on word2vec).
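For reference, the skip-gram with negative sampling (SGNS) objective that such posts analyze is, for an (input word, context word) pair (Mikolov et al., 2013):

```latex
\log \sigma\!\left({v'_{w_O}}^{\top} v_{w_I}\right)
  + \sum_{i=1}^{k} \mathbb{E}_{w_i \sim P_n(w)}
      \left[ \log \sigma\!\left(-{v'_{w_i}}^{\top} v_{w_I}\right) \right]
```

where v_{w_I} is the input ("center") vector, v'_{w_O} the output ("context") vector, and the k negative samples are drawn from a noise distribution P_n(w).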
-
http://textminingonline.com/getting-started-with-word2vec-and-glove-in-python : Getting started with GloVe and word2vec in Python.
-
http://www.foldl.me/2014/glove-python/ : GloVe implementation details in Python.
-
http://videolectures.net/kdd2014_salakhutdinov_deep_learning/ : Tutorial by Ruslan Salakhutdinov.
-
http://www.openu.ac.il/iscol2015/downloads/ISCOL2015_submission25_e_2.pdf : Comparing various word embedding models
-
http://clic.cimec.unitn.it/marco/publications/acl2014/baroni-etal-countpredict-acl2014.pdf : Comparison of count-based models and prediction-based models such as word2vec.
-
https://levyomer.files.wordpress.com/2014/09/neural-word-embeddings-as-implicit-matrix-factorization.pdf : word2vec as matrix factorization
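The headline result of this paper (Levy and Goldberg, 2014): at its optimum, skip-gram with k negative samples implicitly factorizes the word-context PMI matrix shifted by log k, i.e. for a word vector w and context vector c,

```latex
\vec{w} \cdot \vec{c} \;=\; \operatorname{PMI}(w, c) - \log k
```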
-
http://research.microsoft.com/pubs/232372/CIKM14_tutorial_HeGaoDeng.pdf : Tutorial by Microsoft on DL for NLP at CIKM '14
-
http://blog.aidangomez.ca/2016/04/17/Backpropogating-an-LSTM-A-Numerical-Example/ : How backprop works in LSTMs (the so-called BPTT, backpropagation through time).
-
http://www.kdnuggets.com/2015/06/rnn-tutorial-sequence-learning-recurrent-neural-networks.html : Tutorial on sequence learning with recurrent neural networks.
-
http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/ : Series of posts explaining RNNs, with some code.
-
http://colah.github.io/posts/2015-08-Understanding-LSTMs/ : Great post explaining LSTMs
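For reference, the standard LSTM cell that the post above walks through (sigmoid gates, elementwise products denoted by a circle) is, roughly in its notation:

```latex
\begin{aligned}
f_t &= \sigma\left(W_f\,[h_{t-1}, x_t] + b_f\right) \\
i_t &= \sigma\left(W_i\,[h_{t-1}, x_t] + b_i\right) \\
\tilde{C}_t &= \tanh\left(W_C\,[h_{t-1}, x_t] + b_C\right) \\
C_t &= f_t \odot C_{t-1} + i_t \odot \tilde{C}_t \\
o_t &= \sigma\left(W_o\,[h_{t-1}, x_t] + b_o\right) \\
h_t &= o_t \odot \tanh\left(C_t\right)
\end{aligned}
```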
-
https://www.reddit.com/r/MachineLearning/comments/2zkb3b/lstm_a_search_space_odyssey_comparison_of_lstm/ : Comparison of various LSTM architectures ("LSTM: A Search Space Odyssey").
-
http://www.fit.vutbr.cz/~imikolov/rnnlm/ : RNN-based language modelling toolkit by Tomas Mikolov.
-
http://www.fit.vutbr.cz/~imikolov/rnnlm/char.pdf : A newer technique for sequence tasks which I believe will be a point of interest in the coming years: subword-based language models. They are usually good at handling OOV words and spelling errors.
-
http://eric-yuan.me/ner_1/ : Named Entity Recognition using CNN
-
http://arxiv.org/pdf/1511.06388.pdf : Word Sense Disambiguation using Word Embeddings
-
http://www.wildml.com/2015/12/implementing-a-cnn-for-text-classification-in-tensorflow : CNN for Text Classification
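A minimal sketch of a CNN text classifier in the spirit of that post, written here with the current tf.keras API rather than raw TensorFlow (vocabulary size and layer sizes are made up):

```python
from tensorflow.keras import layers, models

vocab_size, embed_dim = 20000, 128   # illustrative values

model = models.Sequential([
    layers.Embedding(vocab_size, embed_dim),
    layers.Conv1D(filters=128, kernel_size=5, activation="relu"),  # n-gram feature detectors
    layers.GlobalMaxPooling1D(),                                   # max-over-time pooling
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),                         # binary label
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(...) expects integer-encoded, padded sequences plus 0/1 labels
```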
-
http://research.microsoft.com/en-us/projects/dssm/ : Deep Learning models (DSSM) for learning semantic representations of text (documents, paragraphs, phrases), which can be used for a variety of tasks including machine translation and document ranking for web search.
-
http://www.aclweb.org/anthology/P15-1130 : Sentiment Analysis using RNN (LSTMs)
-
http://ir.hit.edu.cn/~dytang/paper/emnlp2015/emnlp2015.pdf : Sentiment Analysis using Hierarchical RNNs (GRUs).
-
https://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-with-gpus/ : Machine translation using RNNs.
-
http://neon.nervanasys.com/docs/latest/lstm.html : Practical example of using LSTM for sentiment analysis
-
https://cs224d.stanford.edu/reports/HongJames.pdf : Sentiment Analysis using LSTMs, again.
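As a companion to the sentiment entries above, a minimal sketch of an LSTM sentiment classifier in Keras (hyperparameters are made up; trained on the IMDB dataset bundled with Keras):

```python
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import imdb
from tensorflow.keras.preprocessing.sequence import pad_sequences

max_words, max_len = 20000, 200                      # illustrative values
(x_train, y_train), _ = imdb.load_data(num_words=max_words)
x_train = pad_sequences(x_train, maxlen=max_len)

model = models.Sequential([
    layers.Embedding(max_words, 128),
    layers.LSTM(128, dropout=0.2),                   # last hidden state summarizes the review
    layers.Dense(1, activation="sigmoid"),           # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=64, epochs=2, validation_split=0.1)
```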
-
http://arxiv.org/pdf/1412.5335 : ICLR '15 paper on using ensembles of NNs and generative models (a language model, Naive Bayes) for the sentiment prediction task.
-
http://research.microsoft.com/pubs/214617/www2014_cdssm_p07.pdf : Extension of the DSSM work above, which uses convolution and max-pooling operations to learn low-dimensional semantic representations of text.
-
http://nptel.ac.in/courses/106108056/10 : Jump to the section on unconstrained optimization. Has tutorials on non-convex optimization, which is essential in Deep Learning.
-
http://online.stanford.edu/course/convex-optimization-winter-2014 : Focuses more on the convex optimization side; covers the basics of optimization.
-
http://videolectures.net/deeplearning2015_schmidt_smooth_finite/ : Deep Learning Summer School optimization lecture
-
https://bigquery.cloud.google.com/table/fh-bigquery:reddit_comments.2015_08?pli=1 : Reddit comments dataset
-
https://code.google.com/archive/p/word2vec/ : Links to unlabelled English corpora.
-
http://github.com/brmson/dataset-sts : A variety of datasets wrapped in Python, focused on comparing two sentences, with sample implementations of popular deep NN models in Keras.
-
http://www.mpi-sws.org/~cristian/Cornell_Movie-Dialogs_Corpus.html : Conversation dataset (for learning seq2seq models, possibly leading to a chatbot-style application).
-
https://github.com/rkadlec/ubuntu-ranking-dataset-creator : Ubuntu Dialog Corpus 5.1 (accompanying paper: http://arxiv.org/pdf/1506.08909v3.pdf).
-
http://www.aclweb.org/anthology/P12-2040 : Another dialogue corpus.
-
http://www.lrec-conf.org/proceedings/lrec2012/pdf/1114_Paper.pdf : Yet another dialogue corpus.
-
http://www.cs.technion.ac.il/~gabr/resources/data/ne_datasets.html : NER resources
-
http://linguistics.cornell.edu/language-corpora : List of NLP resources
-
https://github.com/aritter/twitter_nlp/blob/master/data/annotated/ner.txt : Annotated twitter corpus
-
https://www.aclweb.org/anthology/W/W10/W10-0712.pdf : Paper describing the annotation process for NER on a large email dataset (could not find a link to the data itself; if anyone finds it, please feel free to send a PR).
-
http://www.cs.cmu.edu/~mgormley/papers/napoles+gormley+van-durme.naaclw.2012.pdf : Annotated Gigaword corpus.
-
http://jmcauley.ucsd.edu/data/amazon/ : Amazon review dataset (LARGE CORPUS)
-
http://curtis.ml.cmu.edu/w/courses/index.php/Amazon_product_reviews_dataset : Amazon product review dataset (available only on request)
-
http://times.cs.uiuc.edu/~wang296/Data/ : Amazon review dataset
-
https://www.yelp.com/dataset_challenge : Yelp dataset (review + images)
-
Deep Learning libraries
1.1. Theano
1.2. Torch
1.3. TensorFlow
1.4. Keras
1.5. Lasagne
1.6. Blocks and Fuel
1.7. skflow
1.8. scicuda
-
https://github.com/HIPS/autograd : Automatic differentiation tool in Python.
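A minimal sketch of what autograd does: differentiate an ordinary Python/NumPy function (the example function is just an illustration):

```python
import autograd.numpy as np   # thin NumPy wrapper that records operations
from autograd import grad

def tanh(x):
    return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

dtanh = grad(tanh)            # returns a function computing d tanh / dx
print(dtanh(1.0))             # ~0.42, matches 1 - tanh(1)**2
```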
-
Spearmint : Hyperparameter optimization using Bayesian optimization.