
Pretrained Language Model

This repository provides the latest pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.

Directory structure

  • NEZHA is a pretrained Chinese language model that achieves state-of-the-art performance on several Chinese NLP tasks.
  • TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference; a minimal loading sketch follows this list.
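Since TinyBERT shares the standard BERT architecture, a released checkpoint converted to the Hugging Face format can in principle be loaded with the `transformers` library. The sketch below is an illustration, not part of this repository; the path `path/to/tinybert` is a placeholder for a locally downloaded checkpoint:

```python
# Minimal sketch: run a TinyBERT (BERT-compatible) checkpoint for inference.
# Assumes `transformers` and `torch` are installed and that the checkpoint
# directory at "path/to/tinybert" (placeholder) is in Hugging Face format.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("path/to/tinybert")
model = BertModel.from_pretrained("path/to/tinybert")
model.eval()  # disable dropout for deterministic inference

inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Hidden states for each input token: (batch, seq_len, hidden_size)
print(outputs.last_hidden_state.shape)
```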