This repository provides the latest pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
- NEZHA-TensorFlow is a pretrained Chinese language model, implemented in TensorFlow, that achieves state-of-the-art performance on several Chinese NLP tasks.
- NEZHA-PyTorch is the PyTorch version of NEZHA.
- TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference.
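
As a quick illustration of how such pretrained checkpoints are typically consumed, the sketch below loads a TinyBERT general-distillation checkpoint through the Hugging Face `transformers` library. The model identifier `huawei-noah/TinyBERT_General_4L_312D` and the use of the standard BERT classes are assumptions for illustration, not this repository's own loading scripts.

```python
# Minimal sketch, assuming a TinyBERT checkpoint is available on the Hugging
# Face Hub under an ID such as "huawei-noah/TinyBERT_General_4L_312D"
# (assumed name); TinyBERT keeps the BERT architecture, so the standard BERT
# classes from the `transformers` library can load and run it.
import torch
from transformers import BertModel, BertTokenizer

model_id = "huawei-noah/TinyBERT_General_4L_312D"  # assumed checkpoint name
tokenizer = BertTokenizer.from_pretrained(model_id)
model = BertModel.from_pretrained(model_id)
model.eval()

# Encode a sentence and obtain contextual embeddings from the compressed model.
inputs = tokenizer("TinyBERT is a compressed BERT model.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```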