Pinned Repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
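As a quick orientation, here is a minimal usage sketch of the library's pipeline API. The task name "sentiment-analysis" is a standard pipeline task; the default model it downloads and the exact output shown are illustrative and may vary between library versions.

```python
# Minimal sketch of the Transformers pipeline API.
# The default model resolved for "sentiment-analysis" may change between versions.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Knowledge distillation makes BERT small enough for edge devices.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```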
COLING2018
Python code for training and testing the model in the COLING 2018 paper: "Convolutional Neural Network for Universal Sentence Embeddings". This simple CNN model achieves strong performance on semantic similarity tasks in the transfer learning setting, and it can also act as an effective initialization for downstream tasks.
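The paper's exact architecture and hyperparameters are not reproduced here; the following is a minimal PyTorch sketch of a CNN sentence encoder of the kind the description refers to. The filter widths, embedding size, and pooling choice are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class CNNSentenceEncoder(nn.Module):
    """Minimal CNN sentence encoder: embed tokens, apply 1-D convolutions of
    several widths, max-pool over time, and concatenate the pooled feature maps
    into a fixed-size sentence embedding. Hyperparameters are illustrative."""
    def __init__(self, vocab_size, embed_dim=300, num_filters=100, widths=(3, 4, 5)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, kernel_size=w, padding=w - 1)
            for w in widths
        )

    def forward(self, token_ids):                   # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)   # (batch, embed_dim, seq_len)
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return torch.cat(pooled, dim=1)             # (batch, num_filters * len(widths))

encoder = CNNSentenceEncoder(vocab_size=30000)
embeddings = encoder(torch.randint(1, 30000, (2, 16)))  # two toy token sequences
print(embeddings.shape)  # torch.Size([2, 300])
```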
PKD-for-BERT-Model-Compression
PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression.
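For context, a Patient-KD-style objective combines a supervised task loss, a soft-label distillation loss on the logits, and a "patience" loss between student and teacher intermediate representations. The sketch below illustrates that combination in PyTorch; the weights, temperature, and layer mapping are assumptions, not the repository's exact settings.

```python
import torch
import torch.nn.functional as F

def patient_kd_loss(student_logits, teacher_logits, labels,
                    student_hidden, teacher_hidden,
                    temperature=2.0, alpha=0.5, beta=10.0):
    """Illustrative Patient-KD-style objective: task loss + soft-label
    distillation + patience loss on selected intermediate layers.
    Weights and layer selection are assumptions, not the paper's values."""
    # Supervised cross-entropy on ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Distillation from the teacher's softened output distribution.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    # Patience loss: MSE between L2-normalized hidden states of the student
    # layers and the teacher layers they are mapped to.
    pt = sum(
        F.mse_loss(F.normalize(s, dim=-1), F.normalize(t, dim=-1))
        for s, t in zip(student_hidden, teacher_hidden)
    ) / len(student_hidden)

    return (1 - alpha) * ce + alpha * kd + beta * pt
```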
Pretrained-Language-Model
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
XiaoqiJiao's Repositories
XiaoqiJiao/COLING2018
Python code for training and testing the model in the COLING 2018 paper: "Convolutional Neural Network for Universal Sentence Embeddings". This simple CNN model achieves strong performance on semantic similarity tasks in the transfer learning setting, and it can also act as an effective initialization for downstream tasks.
XiaoqiJiao/PKD-for-BERT-Model-Compression
PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression.
XiaoqiJiao/Pretrained-Language-Model
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
XiaoqiJiao/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.