ShenDezhou/Chinese-PreTrained-BERT
We released BERT-wwm, a Chinese pre-trained model based on Whole Word Masking (WWM), along with models closely related to this technique.
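To illustrate the core idea, here is a minimal sketch of Whole Word Masking, assuming word boundaries come from an external Chinese word segmenter (the upstream Chinese-BERT-wwm work used LTP for this). The `whole_word_mask` helper and its `mask_rate` parameter are illustrative and not part of this repository's code: once a word is selected, every character-level token it spans is masked together, rather than masking characters independently as in the original BERT.

```python
import random

def whole_word_mask(words, mask_rate=0.15, mask_token="[MASK]"):
    """Apply Whole Word Masking to pre-segmented Chinese text.

    When a word is selected for masking, ALL of its characters
    (Chinese BERT treats each character as one token) are replaced
    with the mask token, and each original character becomes a
    prediction target.
    """
    tokens, labels = [], []
    for word in words:
        chars = list(word)
        if random.random() < mask_rate:
            tokens.extend([mask_token] * len(chars))  # mask the whole word
            labels.extend(chars)                      # predict original chars
        else:
            tokens.extend(chars)
            labels.extend([None] * len(chars))        # not a prediction target
    return tokens, labels

# Example: "使用语言模型" segmented into words upstream.
words = ["使用", "语言", "模型"]
tokens, labels = whole_word_mask(words, mask_rate=0.5)
print(tokens)  # e.g. ['使', '用', '[MASK]', '[MASK]', '模', '型']
```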