Code for our ICML'24 paper on multimodal dataset distillation
☆43 · Oct 11, 2024 · Updated last year
Alternatives and similar repositories for LoRS_Distill
Users interested in LoRS_Distill are comparing it to the libraries listed below.
- Preview code of ECCV'24 paper "Distill Gold from Massive Ores" (BiLP) · ☆25 · Jul 6, 2024 · Updated last year
- Official implementation of "Dancing with Still Images: Video Distillation via Static-Dynamic Disentanglement" · ☆31 · Dec 21, 2025 · Updated 3 months ago
- ☆63 · Dec 30, 2024 · Updated last year
- ☆30 · Apr 12, 2024 · Updated last year
- Official implementation of ECCV 2024 paper "Take A Step Back: Rethinking the Two Stages in Visual Reasoning" · ☆14 · Jun 1, 2025 · Updated 9 months ago
- ☆15 · May 28, 2024 · Updated last year
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm · ☆81 · Feb 24, 2025 · Updated last year
- [CVPR 2024 highlight] Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching (G-VBSM) · ☆28 · Oct 9, 2024 · Updated last year
- ☆20 · Feb 24, 2025 · Updated last year
- ☆17 · Jun 14, 2024 · Updated last year
- ☆14 · Apr 21, 2023 · Updated 2 years ago
- PyTorch implementation of the NeurIPS 2023 paper "Sparse Parameterization for Epitomic Dataset Distillation" · ☆20 · Jun 28, 2024 · Updated last year
- Dataset Quantization with Active Learning based Adaptive Sampling [ECCV 2024] · ☆10 · Jul 9, 2024 · Updated last year
- ☆29 · Jun 12, 2023 · Updated 2 years ago
- Third-place winner in the generative track of the ECCV 2024 DD Challenge · ☆10 · Oct 11, 2024 · Updated last year
- Distilling Dataset into Generative Models · ☆54 · Mar 15, 2023 · Updated 3 years ago
- (NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation; 50 IPC (Images Per Class) achieves the highest 60.8% on original … · ☆138 · Nov 15, 2024 · Updated last year
- You Only Condense Once: Two Rules for Pruning Condensed Datasets (NeurIPS 2023)