Pinned Repositories
UER-py
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
GPT2-Chinese
Chinese version of GPT2 training code, using BERT tokenizer.
hanzi
model_card
nlp-tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
TencentPretrain
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
hhou435's Repositories
hhou435/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
hhou435/GPT2-Chinese
Chinese version of GPT2 training code, using BERT tokenizer.
hhou435/hanzi
hhou435/nlp-tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
hhou435/TencentPretrain
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
hhou435/UER-py
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
hhou435/model_card
hhou435/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for Pytorch and TensorFlow 2.0.