# Q-A-Model-Bert-Base-Cased

The question answering model in this repository is built on BERT (Bidirectional Encoder Representations from Transformers), specifically the bert-base-cased variant, a transformer-based architecture widely used for natural language processing tasks such as extractive question answering.
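As a rough illustration of how such a model is typically used, the sketch below loads a BERT checkpoint with a question-answering head via the Hugging Face `transformers` library and extracts an answer span from a context passage. This is a minimal example, not this repository's exact code: it assumes `transformers` and `torch` are installed and uses the generic `bert-base-cased` checkpoint as a stand-in, since the fine-tuned weights provided here are not named in this section.

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Assumption: using the base bert-base-cased checkpoint as a placeholder.
# A checkpoint fine-tuned for QA (e.g. on SQuAD) should be substituted here,
# otherwise the span-prediction head is randomly initialized.
checkpoint = "bert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

question = "What architecture is the model built on?"
context = (
    "The question answering model is built on BERT, "
    "a transformer-based architecture for NLP tasks."
)

# Encode the question and context as a single paired input.
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# The model predicts start and end positions of the answer span.
start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits) + 1
answer = tokenizer.decode(inputs["input_ids"][0][start:end])
print(answer)
```

With a properly fine-tuned checkpoint, the argmax of the start and end logits selects the most likely answer span within the context, which is the standard extractive QA setup for BERT-style models.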

