The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR2023)
☆ 40 · updated Mar 25, 2023
Alternatives and similar repositories for FTD-distillation
Users interested in FTD-distillation are comparing it to the repositories listed below.
- Prioritize Alignment in Dataset Distillation (☆ 21 · updated Dec 3, 2024)
- Efficient Dataset Distillation by Representative Matching (☆ 114 · updated Feb 28, 2024)
- [CVPR 2024 highlight] Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching (G-VBSM) (☆ 28 · updated Oct 9, 2024)
- ☆ 42 · updated Sep 5, 2023
- The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) (☆ 18 · updated Mar 21, 2023)
- Data distillation benchmark (☆ 72 · updated Jun 13, 2025)
- ☆ 15 · updated May 28, 2024
- Exploiting Inter-sample and Inter-feature Relations in Dataset Distillation (CVPR 2024) (☆ 11 · updated Jun 16, 2024)
- ☆ 17 · updated Jun 14, 2024
- ☆ 13 · updated Nov 25, 2021
- Official PyTorch implementation of "Loss-Curvature Matching for Dataset Selection and Condensation" (AISTATS 2023) (☆ 22 · updated Mar 14, 2023)
- Preview code of the ECCV 2024 paper "Distill Gold from Massive Ores" (BiLP) (☆ 25 · updated Jul 6, 2024)
- [CVPR 2024] Efficient Dataset Distillation via Minimax Diffusion (☆ 104 · updated Mar 22, 2024)
- Official code for "Dataset Distillation using Neural Feature Regression" (NeurIPS 2022) (☆ 48 · updated Nov 12, 2022)
- Dataset Condensation (ICLR 2021 and ICML 2021) (☆ 543 · updated Nov 27, 2023)
- You Only Condense Once: Two Rules for Pruning Condensed Datasets (NeurIPS 2023) (☆ 15 · updated Nov 18, 2023)
- [NeurIPS 2023 spotlight] Large-scale Dataset Distillation/Condensation, 50 IPC (Images Per Class) achieves the highest 60.8% on original … (☆ 138 · updated Nov 15, 2024)
- PyTorch implementation of the NeurIPS 2022 paper "Dataset Distillation via Factorization" (☆ 67 · updated Nov 28, 2022)
- ☆ 114 · updated May 22, 2023
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm (☆ 81 · updated Feb 24, 2025)
- ☆ 14 · updated Apr 21, 2023
- Code for the paper "Efficient Dataset Distillation using Random Feature Approximation" (☆ 37 · updated Feb 24, 2023)
- ☆ 30 · updated Apr 12, 2024
- Self-Supervised Dataset Distillation for Transfer Learning (☆ 17 · updated Apr 10, 2024)
- Official implementation of "Dataset Condensation with Contrastive Signals" (DCC), accepted at ICML 2022 (☆ 22 · updated Jun 8, 2022)
- ☆ 91 · updated Jan 22, 2023
- ☆ 29 · updated Jun 12, 2023
- [USENIX Security 2022] Mitigating Membership Inference Attacks by Self-Distillation Through a Novel Ensemble Architecture (☆ 16 · updated Aug 29, 2022)
- Distilling Dataset into Generative Models (☆ 54 · updated Mar 15, 2023)
- ☆ 20 · updated Feb 24, 2025
- A curated list of awesome papers on dataset distillation and related applications (☆ 1,913 · updated this week)
- ☆ 14 · updated Apr 25, 2023
- ☆ 41 · updated Nov 19, 2022
- Official implementation of the paper "Spanning Training Progress: Temporal Dual-Depth Scoring (TDDS) for Enhanced Dataset Pruning" (CVPR …) (☆ 22 · updated Aug 20, 2024)
- ☆ 24 · updated Oct 31, 2023
- Lossless Training Speed Up by Unbiased Dynamic Data Pruning (☆ 345 · updated Sep 24, 2024)
- [INTERSPEECH 2023] Target Active Speaker Detection with Audio-visual Cues (☆ 58 · updated May 29, 2023)
- Official PyTorch implementation of "Flexible Dataset Distillation: Learn Labels Instead of Images" (☆ 41 · updated Oct 21, 2020)
- Official PyTorch implementation of "Frequency Domain-based Dataset Distillation" (NeurIPS 2023) (☆ 31 · updated May 7, 2024)