Chinese-BERT-wwm

From the iFLYTEK–Harbin Institute of Technology Joint Laboratory (HFL): Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm pre-trained models).
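As a rough illustration of the whole word masking idea (this is a hypothetical sketch, not code from this repository): instead of masking WordPiece sub-tokens independently as in vanilla BERT, when any part of a word is selected for masking, all sub-tokens belonging to that word are masked together. The function name and interface below are assumptions for demonstration only.

```python
import random

def whole_word_mask(words, mask_ratio=0.15, mask_token="[MASK]", seed=0):
    """Illustrative whole word masking.

    `words` is a list of words, each given as a list of its sub-tokens
    (for Chinese, typically the characters of a segmented word).
    If a word is selected, ALL of its sub-tokens are replaced by the
    mask token together, rather than sub-token by sub-token.
    """
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    masked = []
    for sub_tokens in words:
        if rng.random() < mask_ratio:
            # mask the whole word at once
            masked.extend([mask_token] * len(sub_tokens))
        else:
            masked.extend(sub_tokens)
    return masked
```

For example, with the segmented input `[["使", "用"], ["模", "型"]]`, a selected word such as 使用 is replaced by two `[MASK]` tokens as a unit; its characters are never masked individually.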

Primary language: Python. License: Apache-2.0.
