alephbertgimmel

AlephBertGimmel - Modern Hebrew pretrained BERT model with a 128K token vocabulary.
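As a pretrained BERT model, AlephBertGimmel can be loaded for masked-token prediction with the Hugging Face `transformers` library. The sketch below is a minimal, hedged example: the checkpoint ID `dicta-il/alephbertgimmel-base` is an assumption about where the model is published and is not confirmed by this README — substitute the actual model path or Hub ID.

```python
# Minimal masked-language-model usage sketch for AlephBertGimmel.
# NOTE: the model ID below is an assumption; replace it with the
# actual published checkpoint path or Hugging Face Hub ID.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dicta-il/alephbertgimmel-base")

# Hebrew example: "Jerusalem is the [MASK] of Israel"
preds = fill_mask("ירושלים היא [MASK] של ישראל")
for p in preds:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict containing the candidate token (`token_str`), its probability (`score`), and the filled sequence; the 128K-token vocabulary means many Hebrew words are predicted as whole tokens rather than subword pieces.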


When using AlephBertGimmel, please reference:

Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker and Reut Tsarfaty, "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All", November 2022. arXiv: http://arxiv.org/abs/2211.15199