yolky / RCIG
☆14 · Updated 2 years ago
Alternatives and similar repositories for RCIG
Users interested in RCIG are comparing it to the repositories listed below.
- ICLR 2024, Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching ☆101 · Updated last year
- [CVPR2024 highlight] Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching (G-VBSM) ☆28 · Updated 7 months ago
- Elucidated Dataset Condensation (NeurIPS 2024) ☆22 · Updated 8 months ago
- The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) ☆40 · Updated 2 years ago
- Official Implementation of paper "Distilling Long-tailed Datasets" ☆14 · Updated 2 weeks ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm ☆71 · Updated 3 months ago
- ☆28 · Updated last year
- PyTorch implementation of paper "Sparse Parameterization for Epitomic Dataset Distillation" in NeurIPS 2023. ☆21 · Updated 11 months ago
- Data distillation benchmark ☆64 · Updated this week
- [ICCV 2023] DataDAM: Efficient Dataset Distillation with Attention Matching ☆34 · Updated 11 months ago
- ☆16 · Updated 11 months ago
- Efficient Dataset Distillation by Representative Matching ☆113 · Updated last year
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22) ☆113 · Updated last year
- ☆27 · Updated last year
- ☆16 · Updated last year
- (NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation, 50 IPC (Images Per Class) achieves the highest 60.8% on original … ☆128 · Updated 6 months ago
- Prioritize Alignment in Dataset Distillation ☆20 · Updated 6 months ago
- ☆56 · Updated 5 months ago
- A PyTorch implementation of CVPR 2024 paper "D4M: Dataset Distillation via Disentangled Diffusion Model" ☆31 · Updated 9 months ago
- This repository is the official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022. ☆21 · Updated 2 years ago
- ☆14 · Updated 2 years ago
- ☆86 · Updated 2 years ago
- ☆65 · Updated last year
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation ☆70 · Updated 2 years ago
- Official Code for Dataset Distillation using Neural Feature Regression (NeurIPS 2022) ☆47 · Updated 2 years ago
- FuseFL: One-Shot Federated Learning through the Lens of Causality with Progressive Model Fusion (NeurIPS 2024 Spotlight) ☆12 · Updated 2 months ago
- Code for paper "Parameter Efficient Multi-task Model Fusion with Partial Linearization" ☆21 · Updated 8 months ago
- [ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baha… ☆12 · Updated last year
- [CVPR2024] Efficient Dataset Distillation via Minimax Diffusion ☆90 · Updated last year
- ☆23 · Updated last year