Reza-Marzban/BERT-Tutorial
BERT, or Bidirectional Encoder Representations from Transformers, is a new method of pre-training language representations which obtains state-of-the-art results on a wide array of Natural Language Processing (NLP) tasks.
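As a quick illustration of what a pre-trained BERT model can do, here is a minimal sketch using the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint to fill in a masked token. This is an assumption for demonstration only; the tutorial itself may instead use the original TensorFlow implementation released by Google.

```python
# Minimal sketch: masked-token prediction with a pre-trained BERT model.
# Assumes the Hugging Face `transformers` library and PyTorch are installed;
# this may differ from the approach used in this tutorial.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Mask a token and let BERT predict it from bidirectional context.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the position of the [MASK] token and take the highest-scoring prediction.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # expected output: "paris"
```

Because BERT is pre-trained with a masked-language-modeling objective, it conditions on context from both the left and the right of the masked position, which is what "bidirectional" refers to in its name.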