New (July 20, 2020): The tutorial now uses AllenNLP 1.0, so it should be up to date with the latest release.
This tutorial is meant to teach you both how to use AllenNLP and a principled approach to doing deep learning research in NLP. The content is mirrored (and updated) on my personal site, jbarrow.ai, so if you want to read the latest version you can find it there; the code, however, will always live in this repository. The tutorial consists of 10 sections (plus an appendix), and I recommend working through them in order:
- Setup
- Building a Dataset Reader
- Building a Baseline Model
- Configuring Experiments
- Tackling Your Own Experiments
- Predictors
- Debugging [WIP]
- Advanced Modeling: Hierarchical LSTMs, CRF Decoding, and BERT [WIP]
- Digging Into the Documentation [WIP]
- Hyperparameter Search: AllenTune [WIP]
- Appendix: Migrating from AllenNLP 0.9 to 1.0 [WIP]
The tutorial assumes no prior familiarity with AllenNLP, and walks through using it as an experimentation platform driven by JSON configuration files (a small example is sketched below).
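To give a flavor of what that looks like, here is a minimal sketch of an AllenNLP 1.0 configuration for a simple text classifier. The file name, data paths, and dimensions are placeholders of my own choosing, not the tutorial's actual configs (those are built up in the Configuring Experiments section):

```jsonnet
// experiment.jsonnet -- a hypothetical, minimal config; paths and sizes are placeholders
{
  "dataset_reader": {
    "type": "text_classification_json",
    "token_indexers": {"tokens": {"type": "single_id"}}
  },
  "train_data_path": "data/train.jsonl",
  "validation_data_path": "data/dev.jsonl",
  "model": {
    "type": "basic_classifier",
    "text_field_embedder": {
      "token_embedders": {"tokens": {"type": "embedding", "embedding_dim": 50}}
    },
    "seq2vec_encoder": {"type": "bag_of_embeddings", "embedding_dim": 50}
  },
  "data_loader": {"batch_size": 32, "shuffle": true},
  "trainer": {"optimizer": {"type": "adam"}, "num_epochs": 5}
}
```

A config like this is run with `allennlp train experiment.jsonnet -s output/`, which trains the model and writes checkpoints and metrics to the serialization directory.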