ymcui/Chinese-BERT-wwm
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series models)
Python · Apache-2.0