Studies and practice related to Natural Language Processing (NLP)
- Efficient Estimation of Word Representations in Vector Space
- Distributed Representations of Words and Phrases and their Compositionality
- Transformer | Attention is All You Need
- T5 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
- CTRL | A Conditional Transformer Language Model for Controllable Generation
- OpenAI GPT | Improving Language Understanding by Generative Pre-Training
- BERT | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- RoBERTa | RoBERTa: A Robustly Optimized BERT Pretraining Approach
- ALBERT | ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
- XLNet | XLNet: Generalized Autoregressive Pretraining for Language Understanding
- Supervised Learning of Universal Sentence Representations from Natural Language Inference Data
- Graph Convolutional Networks for Text Classification
  - tf2 implementation link here
- Pytorch (1.5.x)
- Tensorflow (2.x.x)
- transformers (3.x.x)
- nltk
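
Since the dependency list pins transformers (3.x.x) alongside Pytorch (1.5.x) and nltk, here is a minimal environment sanity-check sketch for loading one of the listed pretrained models. The checkpoint name (`bert-base-uncased`) and the example sentence are illustrative assumptions, not taken from this repository.

```python
# Minimal sketch: encode one sentence with a pretrained BERT checkpoint
# using the pinned dependencies (transformers 3.x, PyTorch 1.5.x, nltk).
# Checkpoint name and example sentence are illustrative assumptions.
import nltk
import torch
from transformers import BertModel, BertTokenizer

nltk.download("punkt", quiet=True)  # tokenizer data for nltk.word_tokenize

sentence = "Attention is all you need."
print(nltk.word_tokenize(sentence))  # plain word-level tokenization via nltk

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

# WordPiece-tokenize and encode; return_tensors="pt" yields PyTorch tensors.
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# In transformers 3.x the forward pass returns a tuple:
# (last_hidden_state, pooler_output).
last_hidden_state = outputs[0]
print(last_hidden_state.shape)  # (1, sequence_length, 768)
```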