Pinned Repositories
2020_KDD_Debiasing_TOP13
KDD Cup 2020 Challenges for Modern E-Commerce Platform: Debiasing (15th on the full leaderboard, 13th on the half leaderboard)
BertHub
dq-bart
DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization
drxmy
Config files for my GitHub profile.
Finding-new-discipline-through-data
gold-miner
🥇 Juejin Translation Project — possibly the world's largest and best English-to-Chinese technical translation community, and the translation platform that best understands readers and translators
google-research
Google Research
KDD_CUP_2020_Debiasing_Rush
Solution to the Debiasing Track of KDD CUP 2020
KDDCUP-2020
KDD Cup 2020, 6th-place solution for the Debiasing track
KDDCUP20-Debiasing-Top5
drxmy's Repositories
drxmy/2020_KDD_Debiasing_TOP13
KDD Cup 2020 Challenges for Modern E-Commerce Platform: Debiasing (15th on the full leaderboard, 13th on the half leaderboard)
drxmy/BertHub
drxmy/dq-bart
DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization
drxmy/drxmy
Config files for my GitHub profile.
drxmy/Finding-new-discipline-through-data
drxmy/gold-miner
🥇 Juejin Translation Project — possibly the world's largest and best English-to-Chinese technical translation community, and the translation platform that best understands readers and translators
drxmy/google-research
Google Research
drxmy/KDD_CUP_2020_Debiasing_Rush
Solution to the Debiasing Track of KDD CUP 2020
drxmy/KDDCUP-2020
KDD Cup 2020, 6th-place solution for the Debiasing track
drxmy/KDDCUP20-Debiasing-Top5
drxmy/Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
drxmy/synpg
Code for our EACL-2021 paper "Generating Syntactically Controlled Paraphrases without Using Annotated Parallel Pairs".
drxmy/TensorFlow-Camp
drxmy/TRIME
Training Language Models with Memory Augmentation https://arxiv.org/abs/2205.12674
drxmy/unet
unet for image segmentation