Pinned Repositories
CsdBERT
Code for "A Contrastive Self-distillation BERT with Kernel Alignment-Based Inference", published in ICCS 2023.
DeKD
Code for "Data-Efficient Knowledge Distillation with Teacher Assistant-Based Dynamic Objective Alignment", published in ICCS 2024.
Eplra
Code for "Efficient One-Shot Pruning of Large Language Models with Low-Rank Approximation", published in SMC 2024.
MetaBERT
Code for "MetaBERT: Collaborative Meta-Learning for Accelerating BERT Inference", published in CSCWD 2023.
Pytorch
xyangyan.github.io