princetonvisualai / multimodal_dataset_distillation
☆63 · Updated Dec 30, 2024
Alternatives and similar repositories for multimodal_dataset_distillation
Users that are interested in multimodal_dataset_distillation are comparing it to the libraries listed below
- Code for our ICML'24 paper on multimodal dataset distillation · ☆43 · Updated Oct 11, 2024
- ☆30 · Updated Apr 12, 2024
- ☆41 · Updated Nov 19, 2022
- PyTorch implementation of the NeurIPS 2023 paper "Sparse Parameterization for Epitomic Dataset Distillation" · ☆20 · Updated Jun 28, 2024
- Distilling Dataset into Generative Models · ☆54 · Updated Mar 15, 2023
- ☆15 · Updated May 28, 2024
- ☆17 · Updated Jun 14, 2024
- (NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation, 50 IPC (Images Per Class) achieves the highest 60.8% on original … · ☆136 · Updated Nov 15, 2024
- Official implementation of "Private Set Generation with Discriminative Information" (NeurIPS 2022) · ☆17 · Updated Aug 14, 2023
- Third-place winner in the generative track of the ECCV 2024 DD Challenge · ☆10 · Updated Oct 11, 2024
- Official implementation of "Dancing with Still Images: Video Distillation via Static-Dynamic Disentanglement" · ☆30 · Updated Dec 21, 2025
- ☆114 · Updated May 22, 2023
- Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) · ☆40 · Updated Mar 25, 2023
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm · ☆81 · Updated Feb 24, 2025
- Official PyTorch implementation of Frequency Domain-based Dataset Distillation [NeurIPS 2023] · ☆30 · Updated May 7, 2024
- Official PyTorch implementation of the paper "Distilling Datasets Into Less Than One Image" · ☆39 · Updated Jun 6, 2024
- ☆14 · Updated Apr 21, 2023
- Official code for our CVPR '22 paper "Dataset Distillation by Matching Training Trajectories" · ☆437 · Updated Jul 16, 2024
- Efficient Dataset Distillation by Representative Matching · ☆113 · Updated Feb 28, 2024
- A curated list of awesome papers on dataset distillation and related applications · ☆1,894 · Updated Feb 7, 2026
- [ICCV 2023] Dataset Quantization · ☆263 · Updated Jan 6, 2024
- Code for the paper "Efficient Dataset Distillation using Random Feature Approximation" · ☆37 · Updated Feb 24, 2023
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22) · ☆116 · Updated Oct 18, 2023
- Dataset Condensation (ICLR 2021 and ICML 2021) · ☆544 · Updated Nov 27, 2023
- Official PyTorch implementation of "Loss-Curvature Matching for Dataset Selection and Condensation" (AISTATS 2023) · ☆22 · Updated Mar 14, 2023
- ☆24 · Updated Oct 31, 2023
- Official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022 · ☆22 · Updated Jun 8, 2022
- [CVPR 2024] Efficient Dataset Distillation via Minimax Diffusion · ☆104 · Updated Mar 22, 2024
- Official repository for Heterogeneous Models Dataset Condensation (ECCV 2024, Oral) · ☆10 · Updated Dec 15, 2024
- Dataset Quantization with Active Learning-based Adaptive Sampling [ECCV 2024] · ☆10 · Updated Jul 9, 2024
- [ICLR 2024] Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching · ☆105 · Updated May 23, 2024
- A dataset condensation method, accepted at CVPR 2022 · ☆72 · Updated Dec 12, 2023
- ☆14 · Updated Apr 25, 2023
- Data Valuation without Training of a Model, submitted to ICLR'23 · ☆22 · Updated Dec 30, 2022
- Official code for "Dataset Distillation using Neural Feature Regression" (NeurIPS 2022) · ☆48 · Updated Nov 12, 2022
- [CVPR 2024 highlight] Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching (G-VBSM) · ☆28 · Updated Oct 9, 2024
- ☆17 · Updated Jul 11, 2023
- A simple dataset distillation method: Randomized Truncated Backpropagation Through Time (RaT-BPTT) · ☆13 · Updated Apr 21, 2024
- ☆16 · Updated Sep 6, 2024