This repository implements CAFE, a dataset condensation method accepted at CVPR 2022.
☆71 · Updated Dec 12, 2023
Alternatives and similar repositories for CAFE
Users that are interested in CAFE are comparing it to the libraries listed below
- ☆24 · Updated Oct 14, 2022
- Distilling Dataset into Generative Models ☆54 · Updated Mar 15, 2023
- Efficient Dataset Distillation by Representative Matching ☆114 · Updated Feb 28, 2024
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22) ☆116 · Updated Oct 18, 2023
- Official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022 ☆22 · Updated Jun 8, 2022
- ☆15 · Updated May 28, 2024
- [ICLR 2023 Spotlight] Divide to Adapt: Mitigating Confirmation Bias for Domain Adaptation of Black-Box Predictors ☆39 · Updated Jul 7, 2023
- Dataset Condensation (ICLR'21 and ICML'21) ☆543 · Updated Nov 27, 2023
- PyTorch implementation of the NeurIPS 2023 paper "Sparse Parameterization for Epitomic Dataset Distillation" ☆20 · Updated Jun 28, 2024
- ☆91 · Updated Jan 22, 2023
- [CVPR 2022 Oral] Crafting Better Contrastive Views for Siamese Representation Learning ☆290 · Updated Jun 27, 2022
- [ICLR 2024] Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching ☆106 · Updated May 23, 2024
- Official PyTorch implementation of "Multisize Dataset Condensation" (ICLR'24 Oral) ☆15 · Updated Apr 18, 2024
- Official code for Dataset Distillation using Neural Feature Regression (NeurIPS 2022) ☆48 · Updated Nov 12, 2022
- You Only Condense Once: Two Rules for Pruning Condensed Datasets (NeurIPS 2023) ☆15 · Updated Nov 18, 2023
- ☆30 · Updated Apr 12, 2024
- Official implementation of the paper "Distilling Long-tailed Datasets" [CVPR 2025] ☆19 · Updated Aug 13, 2025
- A curated list of awesome papers on dataset distillation and related applications ☆1,902 · Updated Feb 26, 2026
- [CVPR 2024] Efficient Dataset Distillation via Minimax Diffusion ☆104 · Updated Mar 22, 2024
- (NeurIPS 2023 Spotlight) Large-scale Dataset Distillation/Condensation; 50 IPC (Images Per Class) achieves the highest 60.8% on original … ☆136 · Updated Nov 15, 2024
- ☆41 · Updated Nov 19, 2022
- ☆63 · Updated Dec 30, 2024
- PyTorch implementation of the CVPR 2024 paper "D4M: Dataset Distillation via Disentangled Diffusion Model" ☆39 · Updated Sep 6, 2024
- Official PyTorch implementation of "Loss-Curvature Matching for Dataset Selection and Condensation" (AISTATS 2023) ☆22 · Updated Mar 14, 2023
- Code/models for Defending Against Universal Attacks Through Selective Feature Regeneration (CVPR 2020) ☆10 · Updated Jul 31, 2020
- PyTorch implementation of the NeurIPS 2022 paper "Dataset Distillation via Factorization" ☆67 · Updated Nov 28, 2022
- Code for our ICML'24 paper on multimodal dataset distillation ☆43 · Updated Oct 11, 2024
- ☆37 · Updated May 28, 2025
- Official PyTorch implementation of "Flexible Dataset Distillation: Learn Labels Instead of Images" ☆41 · Updated Oct 21, 2020
- Code for the EmotiW 2019 Student Engagement Regression Task ☆18 · Updated Jul 12, 2019
- ☆14 · Updated Apr 25, 2023
- ☆10 · Updated Jul 28, 2022
- ☆114 · Updated May 22, 2023
- Official PyTorch implementation of Frequency Domain-based Dataset Distillation (NeurIPS 2023) ☆30 · Updated May 7, 2024
- Code for "DetectorGuard: Provably Securing Object Detectors against Localized Patch Hiding Attacks" ☆15 · Updated Jul 13, 2022
- ☆15 · Updated Apr 7, 2023
- Soft-Label Dataset Distillation and Text Dataset Distillation ☆74 · Updated Nov 17, 2022
- [ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baha… ☆15 · Updated May 18, 2024
- DataLoader for the TinyImageNet dataset ☆12 · Updated Sep 15, 2021