Learn PyTorch with project-based tutorials. These tutorials demonstrate modern techniques with readable code and use real data from the internet.
Applying recurrent neural networks to natural language tasks, from classification to generation (a minimal classifier sketch follows the list).
- Classifying Names with a Character-Level RNN
- Generating Shakespeare with a Character-Level RNN
- Generating Names with a Conditional Character-Level RNN
- Translation with a Sequence to Sequence Network and Attention
- Exploring Word Vectors with GloVe
- WIP Sentiment Analysis with a Word-Level RNN and GloVe Embeddings
- WIP Predicting Discrete Events with an RNN
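For a taste of what the first tutorial builds, here is a minimal sketch of a character-level RNN classifier. The names and sizes (`CharRNN`, `n_chars`, `hidden_size`) are illustrative assumptions, not the tutorials' exact code:

```python
import torch
import torch.nn as nn

# A minimal character-level RNN classifier, built from scratch the way the
# first tutorial does. Names and sizes here are illustrative assumptions.
class CharRNN(nn.Module):
    def __init__(self, n_chars, hidden_size, n_classes):
        super().__init__()
        self.hidden_size = hidden_size
        # each step combines the current character (one-hot) with the prior hidden state
        self.i2h = nn.Linear(n_chars + hidden_size, hidden_size)
        self.i2o = nn.Linear(n_chars + hidden_size, n_classes)

    def forward(self, char, hidden):
        combined = torch.cat((char, hidden), dim=1)
        return self.i2o(combined), torch.tanh(self.i2h(combined))

    def init_hidden(self):
        return torch.zeros(1, self.hidden_size)

# feed a name one character at a time; the final output scores each class
rnn = CharRNN(n_chars=57, hidden_size=128, n_classes=18)
hidden = rnn.init_hidden()
for char in torch.eye(57)[:5]:              # five dummy one-hot characters
    output, hidden = rnn(char.unsqueeze(0), hidden)
```

The tutorial itself adds a log-softmax output and a training loop over (name, language) pairs; this sketch only shows the recurrence.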
The quickest way to run these on a fresh Linux or Mac machine is to install Anaconda:
```sh
curl -LO https://repo.continuum.io/archive/Anaconda3-4.3.0-Linux-x86_64.sh
bash Anaconda3-4.3.0-Linux-x86_64.sh
```
Then install PyTorch:
```sh
conda install pytorch -c soumith
```
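To confirm the install worked, a quick check (this snippet is an assumption, not part of the original instructions):

```python
# Minimal sanity check (assumed, not from the original README): import
# PyTorch and run a tiny tensor operation.
import torch

print(torch.__version__)   # installed PyTorch version
x = torch.rand(3, 4)       # random 3x4 tensor
print(x.sum())             # simple reduction to confirm ops run
```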
Then clone this repo and start Jupyter Notebook:
```sh
git clone https://github.com/spro/practical-pytorch
cd practical-pytorch
jupyter notebook
```
- http://pytorch.org/ for installation instructions
- Official PyTorch tutorials for more tutorials (some of these tutorials are included there)
- Deep Learning with PyTorch: A 60-minute Blitz to get started with PyTorch in general
- Introduction to PyTorch for former Torchies if you are coming from Lua Torch
- jcjohnson's PyTorch examples for a more in-depth overview (including custom modules and autograd functions)
- The Unreasonable Effectiveness of Recurrent Neural Networks shows a variety of real-life examples
- Deep Learning, NLP, and Representations for an overview of word embeddings and RNNs for NLP
- Understanding LSTM Networks covers how LSTMs work specifically, but is also informative about RNNs in general
- Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
- Sequence to Sequence Learning with Neural Networks
- Neural Machine Translation by Jointly Learning to Align and Translate
- Effective Approaches to Attention-based Neural Machine Translation (a minimal attention sketch follows this list)
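The last two papers introduce the attention mechanisms used in the translation tutorial. Here is a minimal sketch of additive (Bahdanau-style) attention; the names and shapes are illustrative assumptions, not code from the tutorials:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Sketch of additive (Bahdanau-style) attention. Names and sizes are
# illustrative assumptions, not code from the tutorials.
class AdditiveAttention(nn.Module):
    def __init__(self, hidden_size):
        super().__init__()
        self.W = nn.Linear(hidden_size * 2, hidden_size)
        self.v = nn.Linear(hidden_size, 1, bias=False)

    def forward(self, decoder_hidden, encoder_outputs):
        # decoder_hidden: (1, hidden); encoder_outputs: (seq_len, hidden)
        seq_len = encoder_outputs.size(0)
        repeated = decoder_hidden.expand(seq_len, -1)
        # score each encoder step against the current decoder state
        scores = self.v(torch.tanh(self.W(torch.cat((repeated, encoder_outputs), dim=1))))
        weights = F.softmax(scores.squeeze(1), dim=0)     # one weight per source position
        context = weights.unsqueeze(0) @ encoder_outputs  # weighted sum: (1, hidden)
        return context, weights

# usage with dummy values: 10 source positions, hidden size 128
attn = AdditiveAttention(128)
context, weights = attn(torch.zeros(1, 128), torch.randn(10, 128))
```

The decoder then conditions each output step on this context vector, letting it focus on different source positions as it translates.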
If you have ideas or find mistakes, please leave a note.