# language-detection-multilingualBERT

A language detection model built on the pretrained multilingual BERT model. The model was fine-tuned on the Wiki-40B dataset, which contains Wikipedia entries in 41 languages; 16 of those languages were used for training.
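A minimal fine-tuning sketch of this setup, using the Hugging Face `transformers` and `datasets` libraries. The specific 16-language subset, checkpoint (`bert-base-multilingual-cased`), sample sizes, and hyperparameters below are illustrative assumptions, not the repo's exact configuration:

```python
from datasets import load_dataset, concatenate_datasets
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

# Hypothetical 16-language subset of Wiki-40B's 41 languages.
LANGS = ["en", "de", "fr", "es", "it", "pt", "nl", "pl",
         "ru", "ja", "zh-cn", "ko", "ar", "tr", "sv", "fi"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

def load_lang(lang, label):
    # Each Wiki-40B config is a single language, so the language id
    # becomes the classification label. A small slice keeps the sketch fast.
    ds = load_dataset("google/wiki40b", lang, split="train[:1000]")
    return ds.map(lambda ex: {"label": label})

train = concatenate_datasets([load_lang(l, i) for i, l in enumerate(LANGS)])

def tokenize(batch):
    # Short truncated inputs are enough for language identification.
    return tokenizer(batch["text"], truncation=True, max_length=128)

train = train.map(tokenize, batched=True)

# Sequence classification head with one class per training language.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=len(LANGS)
)

args = TrainingArguments(
    output_dir="lang-detect-mbert",
    per_device_train_batch_size=32,
    num_train_epochs=1,
    learning_rate=2e-5,
)

Trainer(model=model, args=args, train_dataset=train).train()
```

At inference time, the predicted class index maps back into `LANGS` to give the detected language for an input sentence.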
