Pinned Repositories
Awesome-Dataset-Distillation
Awesome Dataset Distillation Papers
dataset-distillation-with-attention-labels
Implementation of "Dataset Distillation with Attention Labels for fine-tuning BERT" (accepted by ACL2023 main (short))
DiLM
Implementaiton of "DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation" (accepted by NAACL2024 Findings)".
GR-HMI
Implementation for Generative Replay inspired by Hippocampal Memory Indexing for Continual Language Learning
mtt-distillation
Official code for our CVPR '22 paper "Dataset Distillation by Matching Training Trajectories"
setup-remote
text-dataset-distillation
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
RSTParser_EACL24
Implementation of "Can we obtain significant success in RST discourse parsing by using Large Language Models?" (accepted by EACL 2024)
arumaekawa's Repositories
arumaekawa/dataset-distillation-with-attention-labels
Implementation of "Dataset Distillation with Attention Labels for fine-tuning BERT" (accepted by ACL2023 main (short))
arumaekawa/DiLM
Implementaiton of "DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation" (accepted by NAACL2024 Findings)".
arumaekawa/text-dataset-distillation
arumaekawa/GR-HMI
Implementation for Generative Replay inspired by Hippocampal Memory Indexing for Continual Language Learning
arumaekawa/Awesome-Dataset-Distillation
Awesome Dataset Distillation Papers
arumaekawa/mtt-distillation
Official code for our CVPR '22 paper "Dataset Distillation by Matching Training Trajectories"
arumaekawa/setup-remote
arumaekawa/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.