dataset-distillation
There are 14 repositories under the dataset-distillation topic.
Emory-Melody/awesome-graph-reduction
[IJCAI 2024] Papers on graph reduction, including graph coarsening, graph condensation, graph sparsification, graph summarization, etc.
ChandlerBang/GCond
[ICLR'22] [KDD'22] [IJCAI'24] Implementation of "Graph Condensation for Graph Neural Networks"
VILA-Lab/SRe2L
(NeurIPS 2023 spotlight) Large-scale dataset distillation/condensation; at 50 IPC (images per class) it reaches 60.8% accuracy on the original ImageNet-1K validation set.
snu-mllab/Efficient-Dataset-Condensation
Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22)
NUS-HPC-AI-Lab/DATM
[ICLR 2024] Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching
AsafShul/PoDD
Official PyTorch implementation of the paper "Distilling Datasets Into Less Than One Image".
liuyugeng/baadd
Code for "Backdoor Attacks Against Dataset Distillation"
XYGaoG/Graph-Condensation-Papers
Awesome Graph Condensation Papers
VITA-Group/ProgressiveDD
[ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baharan Mirzasoleiman
EnnengYang/An-Efficient-Dataset-Condensation-Plugin
[NeurIPS 2023] An Efficient Dataset Condensation Plugin and Its Application to Continual Learning.
kghandour/dd3d
Dataset Distillation on 3D Point Clouds using Gradient Matching
enkiwang/Dataset-distillation-papers
A collection of dataset distillation papers.
mashijie1028/TrustDD
Code for our paper "Towards Trustworthy Dataset Distillation" (Pattern Recognition 2025)
zeyuanyin/SRe2L-CL
Continual Learning code for SRe2L paper (NeurIPS 2023 spotlight)
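Several of the repositories above (e.g. GCond, dd3d) condense a dataset by gradient matching: learn a tiny synthetic set whose training gradients mimic those of the real data. Below is a minimal, NumPy-only toy sketch of that idea on logistic regression; all data, hyperparameters, and helper names are illustrative and not taken from any of the listed repositories.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: two Gaussian blobs (binary classification).
X_real = np.vstack([rng.normal(-2.0, 1.0, (50, 2)),
                    rng.normal(2.0, 1.0, (50, 2))])
y_real = np.concatenate([np.zeros(50), np.ones(50)])

# Synthetic set to be learned: one sample per class (IPC = 1).
y_syn = np.array([0.0, 1.0])
X_syn = rng.normal(0.0, 1.0, (2, 2))
X_init = X_syn.copy()

def grad(w, X, y):
    """Gradient of the logistic loss w.r.t. the weight vector w."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def match_loss(X, probes):
    """Mean squared gradient mismatch between synthetic and real data."""
    return np.mean([np.sum((grad(w, X, y_syn) - grad(w, X_real, y_real)) ** 2)
                    for w in probes])

# Held-out probe weights, used only to evaluate the matching objective.
probes = [rng.normal(0.0, 1.0, 2) for _ in range(20)]

lr, eps = 0.1, 1e-4
for _ in range(300):
    w = rng.normal(0.0, 1.0, 2)  # fresh random model initialization each step
    step = np.zeros_like(X_syn)
    for i in range(X_syn.shape[0]):
        for j in range(X_syn.shape[1]):
            # Central finite difference of the matching loss w.r.t. X_syn[i, j].
            X_syn[i, j] += eps
            d_plus = np.sum((grad(w, X_syn, y_syn) - grad(w, X_real, y_real)) ** 2)
            X_syn[i, j] -= 2 * eps
            d_minus = np.sum((grad(w, X_syn, y_syn) - grad(w, X_real, y_real)) ** 2)
            X_syn[i, j] += eps
            step[i, j] = (d_plus - d_minus) / (2 * eps)
    X_syn -= lr * step

print("matching loss:", match_loss(X_init, probes), "->", match_loss(X_syn, probes))
```

The published methods differ in scale and detail (deep networks, autograd instead of finite differences, trajectory rather than single-step matching in DATM/ProgressiveDD), but the objective being minimized is of this gradient-mismatch form.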