[ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baharan Mirzasoleiman
Alternatives and similar repositories for ProgressiveDD
Users interested in ProgressiveDD are comparing it to the repositories listed below.
- PyTorch implementation of the paper "Sparse Parameterization for Epitomic Dataset Distillation" (NeurIPS 2023)
- Official PyTorch implementation of "Multisize Dataset Condensation" (ICLR 2024 Oral)
- [CVPR 2024] "On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm"