Jan2020-NLPLab

End-to-end BERT Q&A model tutorial on SageMaker (bring your own container).

AKO 2020: State-of-the-art NLP with MXNet

Presenters: Rachel Hu, Wen-ming Ye, Laurens ten Cate

Abstract

Implementing natural language processing (NLP) models just got simpler and faster. We will quickly introduce BERT (Bidirectional Encoder Representations from Transformers), the state-of-the-art (SOTA) NLP model, and demonstrate how it can be used for various NLP tasks. In this chalk talk, learn how to implement NLP models using Apache MXNet and the GluonNLP toolkit to quickly prototype products, validate new ideas, and learn SOTA NLP techniques. We will also show how you can use GluonNLP and Amazon SageMaker to fine-tune BERT for a text classification use case and deploy the trained model. Come join us to train your NLP model onsite!
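
For a taste of the fine-tuning half of the lab, here is a minimal sketch of loading a pre-trained BERT base model from the GluonNLP model zoo and attaching a small classification head, following the structure of the official GluonNLP fine-tuning tutorial. The two-class setup, dropout value, and CPU context are illustrative assumptions, not necessarily the lab's exact configuration:

```python
import mxnet as mx
from mxnet.gluon import nn
import gluonnlp as nlp

ctx = mx.cpu()  # use mx.gpu(0) on a GPU instance

# Load pre-trained BERT base and its vocabulary from the GluonNLP model zoo.
bert_base, vocab = nlp.model.get_model(
    'bert_12_768_12',
    dataset_name='book_corpus_wiki_en_uncased',
    pretrained=True,
    ctx=ctx,
    use_pooler=True,       # pooled [CLS] output feeds the classifier head
    use_decoder=False,     # no masked-LM decoder needed for classification
    use_classifier=False)  # no next-sentence head needed either

class BERTClassifier(nn.Block):
    """BERT encoder followed by dropout and a dense layer over the pooled [CLS] output."""
    def __init__(self, bert, num_classes=2, dropout=0.1, **kwargs):
        super().__init__(**kwargs)
        self.bert = bert
        self.classifier = nn.HybridSequential()
        self.classifier.add(nn.Dropout(dropout))
        self.classifier.add(nn.Dense(num_classes))

    def forward(self, inputs, token_types, valid_length):
        _, pooled = self.bert(inputs, token_types, valid_length)
        return self.classifier(pooled)

model = BERTClassifier(bert_base, num_classes=2)
# Only the new head needs initialization; the encoder weights are pre-trained.
model.classifier.initialize(init=mx.init.Normal(0.02), ctx=ctx)
```

From here, training is the standard Gluon loop: tokenize with `nlp.data.BERTTokenizer(vocab)`, batch (token ids, token types, valid length) triples, and minimize `SoftmaxCrossEntropyLoss` with a `Trainer`.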

Start the lab in the [tutorial folder](https://github.com/awshlabs/Jan2020-NLPLab/tree/master/tutorial).
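
Since this is a bring-your-own-container lab, the SageMaker side follows the generic BYOC workflow sketched below, using the SageMaker Python SDK (v2 API). The ECR image URI, S3 prefix, and instance types are hypothetical placeholders, not the lab's actual values:

```python
import sagemaker
from sagemaker.estimator import Estimator

sess = sagemaker.Session()
role = sagemaker.get_execution_role()

# Hypothetical ECR URI for the custom container built in the lab;
# replace with the URI produced by your own `docker push`.
image_uri = '123456789012.dkr.ecr.us-east-1.amazonaws.com/bert-qa:latest'

estimator = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type='ml.p3.2xlarge',  # GPU instance for fine-tuning
    sagemaker_session=sess)

# Hypothetical S3 prefix holding the training data.
estimator.fit({'train': 's3://my-bucket/bert-finetune/train'})

# Deploy the trained model behind a real-time HTTPS endpoint.
predictor = estimator.deploy(initial_instance_count=1,
                             instance_type='ml.m5.xlarge')
```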