SAI-h's Stars
DsaltYfish/PILL
bigdata-inha/TA-DFKD-Official
An implementation of TA-DFKD.
RUCAIBox/DAGFM
This is the official PyTorch implementation for the paper: "Directed Acyclic Graph Factorization Machines for CTR Prediction via Knowledge Distillation"
huggingface/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
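Parameter-efficient methods such as LoRA, which PEFT implements, replace a full weight update with a low-rank one. A minimal pure-Python sketch of the idea (hypothetical helper names, not the PEFT API), assuming the standard merge rule W' = W + (alpha / r) * B @ A:

```python
# Sketch of merging a LoRA update into a frozen weight matrix.
# Hypothetical illustration of the idea behind LoRA; not the peft API.

def matmul(a, b):
    """Multiply two matrices given as lists of lists."""
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

def lora_merge(w, a, b, alpha):
    """Return w + (alpha / r) * (b @ a).
    w: d_out x d_in frozen weight, b: d_out x r, a: r x d_in."""
    r = len(a)  # rank = number of rows of A
    scale = alpha / r
    delta = matmul(b, a)
    return [[w[i][j] + scale * delta[i][j] for j in range(len(w[0]))]
            for i in range(len(w))]

# Tiny example: rank-1 update of a 2x2 identity weight.
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [2.0]]   # d_out x r
A = [[0.5, 0.5]]     # r x d_in
W_merged = lora_merge(W, A, B, alpha=1.0)  # -> [[1.5, 0.5], [1.0, 2.0]]
```

Only A and B (2 * r * d parameters instead of d * d) are trained; the merge can be applied once after training so inference costs nothing extra.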
VainF/Torch-Pruning
[CVPR 2023] DepGraph: Towards Any Structural Pruning
wuyouzhuguli/SpringAll
Step-by-step tutorials covering Spring Boot, Spring Boot & Shiro, Spring Batch, Spring Cloud, Spring Cloud Alibaba, and Spring Security & Spring Security OAuth2. Source code for the Spring blog series: https://mrbird.cc
d-gcc/LightTS
Code for the paper "LightTS: Lightweight Time Series Classification with Adaptive Ensemble"
weiaicunzai/pytorch-cifar100
Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, Inception-v3, Inception-v4, Inception-ResNet-v2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNet-v2, MobileNet, MobileNet-v2, SqueezeNet, NASNet, Residual Attention Network, SENet, WideResNet)
open-mmlab/mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
zeromake/library
A personal book catalog. Please don't fork it; there are no book files inside 😱
zygmuntz/goodbooks-10k
Ten thousand books, six million ratings
ljrprocc/DFKD
An implementation of previous data-free knowledge distillation (DFKD) methods.
AberHu/Knowledge-Distillation-Zoo
PyTorch implementation of various knowledge distillation (KD) methods.
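Most KD variants in collections like this build on the classic Hinton et al. loss: a KL divergence between teacher and student outputs softened by a temperature T. A self-contained pure-Python sketch (illustrative only, not this repo's code):

```python
import math

def softmax(logits, t=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(z / t) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, t=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in the original distillation paper."""
    p = softmax(teacher_logits, t)   # soft teacher targets
    q = softmax(student_logits, t)   # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return t * t * kl

# Identical logits give (near) zero loss; mismatched logits give a positive one.
zero = kd_loss([1.0, 2.0], [1.0, 2.0])
gap = kd_loss([0.0, 1.0], [1.0, 0.0])
```

In practice this term is mixed with the ordinary cross-entropy on hard labels; the T^2 factor keeps gradient magnitudes comparable as the temperature changes.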
Lan1991Xu/ONE_NeurIPS2018
AnTuo1998/AE-KD
rixwew/pytorch-fm
Factorization Machine models in PyTorch
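A second-order factorization machine scores y = w0 + Σᵢ wᵢxᵢ + Σᵢ<ⱼ ⟨vᵢ, vⱼ⟩ xᵢxⱼ, and the pairwise term can be computed in O(kn) via 0.5 · Σ_f [(Σᵢ v_{if}xᵢ)² − Σᵢ v_{if}²xᵢ²]. A pure-Python sketch of that identity (hypothetical function name, not the library's API):

```python
def fm_score(x, w0, w, v):
    """Second-order FM score.
    x: n feature values, w: n linear weights, v: n x k factor matrix."""
    linear = w0 + sum(wi * xi for wi, xi in zip(w, x))
    k = len(v[0])
    pairwise = 0.0
    for f in range(k):
        # O(n) per factor dimension instead of O(n^2) over feature pairs
        s = sum(v[i][f] * x[i] for i in range(len(x)))
        s_sq = sum((v[i][f] * x[i]) ** 2 for i in range(len(x)))
        pairwise += s * s - s_sq
    return linear + 0.5 * pairwise

# Two active features: the pairwise term reduces to <v_0, v_1> * x_0 * x_1.
x = [1.0, 1.0]
w0, w = 0.1, [0.2, 0.3]
v_orth = [[1.0, 0.0], [0.0, 2.0]]  # orthogonal factors -> pairwise term is 0
score = fm_score(x, w0, w, v_orth)  # -> 0.6
```

The same reformulation is what lets GPU implementations such as pytorch-fm express the interaction term with a couple of vectorized reductions.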
junxiaosong/AlphaZero_Gomoku
An implementation of the AlphaZero algorithm for Gomoku (also called Gobang or Five in a Row)
geek-ai/irgan
Experimental code for the IRGAN SIGIR paper.
torvalds/linux
Linux kernel source tree
Yueeeeeeee/RecSys-Extraction-Attack
[RecSys 2021] PyTorch Implementation of Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction
edervishaj/GANMF
This is the repository for our paper "GAN-based Matrix Factorization for Recommender Systems" accepted at ACM/SIGAPP Symposium on Applied Computing (SAC '22).
skgyu/SpaceshipNet
Code of Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint
ljrprocc/DataFree
[INS] Dynamic Data-Free Knowledge Distillation by an Easy-to-Hard Strategy
zju-vipa/Fast-Datafree
[AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation
Rorozhl/CA-MKD
Implementation of the ICASSP 2022 paper "Confidence-Aware Multi-Teacher Knowledge Distillation".
VainF/Data-Free-Adversarial-Distillation
Code and pretrained models for the paper "Data-Free Adversarial Distillation".
cvat-ai/cvat
Annotate better with CVAT, the industry-leading data engine for machine learning. Used and trusted by teams at any scale, for data of any scale.
yuxumin/CoRe
[ICCV 2021] Group-aware Contrastive Regression for Action Quality Assessment
Atcold/NYU-DLSP20
NYU Deep Learning Spring 2020
freefq/free
Free censorship-circumvention resources: free SS/V2Ray/Trojan nodes, Lantern, Google Play access, and other free proxy tools.