qixy13's Repositories
qixy13/albert
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
qixy13/albert_zh
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations; large-scale pre-trained Chinese ALBERT models
qixy13/BertHub
qixy13/c3
Investigating Prior Knowledge for Challenging Chinese Machine Reading Comprehension
qixy13/Chinese-ELECTRA
Pre-trained Chinese ELECTRA models
qixy13/dcmn
Dual Co-Matching Network for Machine Reading Comprehension
qixy13/DeBERTa
The implementation of DeBERTa
qixy13/duma_code
Code for DUMA: Reading Comprehension with Transposition Thinking
qixy13/ERNIE
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
qixy13/ext_data_for_haihua_ai_mrc
qixy13/FlexNeuART
Flexible classic and NeurAl Retrieval Toolkit
qixy13/Funnel-Transformer
qixy13/go-best-practice
Go in practice: advice on writing maintainable Go code
qixy13/go-spring
An IoC-based one-stop development framework for Go backends 🚀
qixy13/guwenbert
GuwenBERT: A Pre-trained Language Model for Classical Chinese (Literary Chinese)
qixy13/MacBERT
Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT)
qixy13/Megatron-LM
Ongoing research training transformer models at scale
qixy13/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.