Pinned Repositories
Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Awesome-Model-Compression-in-Recommender-Systems
A collection of high-quality, recent papers on model compression in recommender systems
Common-datasets-in-Recommender-System-by-NTU-
This repo contains several common Recommender System datasets, initially preprocessed by Nanyang Technological University (NTU)
ecnu-PGCourseShare
East China Normal University (ECNU) graduate course guide sharing project
PRecQ
The source code for the paper 'Quantize Sequential Recommenders without Private Data', accepted at WWW 2023
transformer
Transformer: PyTorch Implementation of "Attention Is All You Need"
Sinp17's Repositories
Sinp17/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
Sinp17/Awesome-Model-Compression-in-Recommender-Systems
A collection of high-quality, recent papers on model compression in recommender systems
Sinp17/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Sinp17/Common-datasets-in-Recommender-System-by-NTU-
This repo contains several common Recommender System datasets, initially preprocessed by Nanyang Technological University (NTU)
Sinp17/ecnu-PGCourseShare
East China Normal University (ECNU) graduate course guide sharing project
Sinp17/PRecQ
The source code for the paper 'Quantize Sequential Recommenders without Private Data', accepted at WWW 2023
Sinp17/transformer
Transformer: PyTorch Implementation of "Attention Is All You Need"