ilia10000 / dataset-distillation
Soft-Label Dataset Distillation and Text Dataset Distillation
☆74 · Updated Nov 17, 2022
Alternatives and similar repositories for dataset-distillation
Users interested in dataset-distillation are comparing it to the libraries listed below.
- Official PyTorch implementation of “Flexible Dataset Distillation: Learn Labels Instead of Images”☆41 · Updated Oct 21, 2020
- Open-source code for the paper "Dataset Distillation"☆821 · Updated Jun 17, 2025
- Implementation of "Dataset Distillation with Attention Labels for Fine-tuning BERT" (accepted at ACL 2023 main, short)☆23 · Updated Jan 8, 2024
- ☆41 · Updated Nov 19, 2022
- ☆91 · Updated Jan 22, 2023
- Implementation of "DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation" (accepted at NAACL 2024 Findings)☆28 · Updated Feb 10, 2025
- ☆24 · Updated Oct 31, 2023
- A curated list of awesome papers on dataset distillation and related applications.☆1,895 · Updated this week
- PyTorch implementation of the NeurIPS 2023 paper "Sparse Parameterization for Epitomic Dataset Distillation"☆20 · Updated Jun 28, 2024
- Official code for the CVPR 2022 paper "Dataset Distillation by Matching Training Trajectories"☆437 · Updated Jul 16, 2024
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML 2022)☆116 · Updated Oct 18, 2023
- Code for the CVPR 2023 paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation"☆40 · Updated Mar 25, 2023
- Official implementation of "Dataset Condensation with Contrastive Signals" (DCC), accepted at ICML 2022☆22 · Updated Jun 8, 2022
- ☆13 · Updated Mar 25, 2022
- The contrastive token loss function for reducing generative repetition in autoregressive neural language models☆13 · Updated May 11, 2022
- Unofficial PyTorch implementation of "FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence"☆48 · Updated Jun 22, 2022
- A dataset condensation method, accepted at CVPR 2022☆72 · Updated Dec 12, 2023
- Bias Benchmark for Natural Language Inference. Code repo for the Findings of NAACL 2022 paper "On Measuring Social Biases in Prompt-Based…☆15 · Updated Apr 28, 2022
- Codebase for the ACL 2023 paper "Mitigating Label Biases for In-context Learning"☆10 · Updated Aug 4, 2023
- Linear Mode Connectivity in Multitask and Continual Learning: https://arxiv.org/abs/2010.04495☆12 · Updated Oct 12, 2020
- Code for SelfAugment☆27 · Updated Dec 16, 2020
- [CVPR 2024 highlight] Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching (G-VBSM)☆28 · Updated Oct 9, 2024
- ☆21 · Updated Jun 22, 2025
- Code for the EMNLP 2020 long paper "Lifelong Language Knowledge Distillation" (https://arxiv.org/abs/2010.02123)☆12 · Updated Jul 13, 2021
- ☆30 · Updated Mar 19, 2021
- ☆13 · Updated Apr 12, 2018
- In-BoXBART: Get Instructions into Biomedical Multi-task Learning☆14 · Updated Aug 23, 2022
- ☆15 · Updated May 28, 2024
- PyTorch implementation of the CVPR 2021 paper "SuperMix: Supervising the Mixing Data Augmentation"☆92 · Updated Jan 6, 2022
- ☆17 · Updated Jun 14, 2024
- Repository of Jupyter Notebooks on Colab, Binder and Huggingface for Bio, Chemistry and Physics☆13 · Updated Jul 29, 2023
- DataLoader for the TinyImageNet dataset☆12 · Updated Sep 15, 2021
- ☆35 · Updated May 9, 2025
- Efficient Dataset Distillation by Representative Matching☆113 · Updated Feb 28, 2024
- ☆19 · Updated this week
- ☆15 · Updated Nov 12, 2021
- PyTorch implementation of the paper "Adversarial Continual Learning"☆255 · Updated May 25, 2023
- Official repository for Batch Level Distillation (BLD)☆15 · Updated Jan 25, 2021
- Prioritize Alignment in Dataset Distillation☆21 · Updated Dec 3, 2024