GeorgeCazenavette / mtt-distillation
Official code for our CVPR '22 paper "Dataset Distillation by Matching Training Trajectories"
☆419 · Updated 8 months ago
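For context, trajectory matching distills a synthetic set by training a student network on it for a few differentiable SGD steps starting from an expert checkpoint, then penalizing the distance between the student's resulting parameters and a later checkpoint of the same expert trained on real data. Below is a minimal, hypothetical PyTorch sketch of that idea (not the repository's actual code); the tiny functional linear model and names such as `expert_start`, `expert_target`, and `syn_x` are illustrative assumptions.

```python
# Hypothetical sketch of trajectory matching, not the repo's implementation.
import torch
import torch.nn.functional as F

def flat_linear(params, x, num_classes=10):
    """Tiny functional model: a single linear layer stored as one flat parameter vector."""
    d = x.shape[1]
    W = params[: d * num_classes].view(num_classes, d)
    b = params[d * num_classes:]
    return x @ W.t() + b

def trajectory_matching_loss(syn_x, syn_y, expert_start, expert_target,
                             inner_steps=5, inner_lr=0.1):
    """Train a student on the synthetic set from expert_start, then measure how far
    its parameters end up from the expert's later checkpoint expert_target."""
    student = expert_start.clone().requires_grad_(True)
    for _ in range(inner_steps):
        logits = flat_linear(student, syn_x)
        loss = F.cross_entropy(logits, syn_y)
        # create_graph=True keeps the inner updates differentiable w.r.t. syn_x
        (grad,) = torch.autograd.grad(loss, student, create_graph=True)
        student = student - inner_lr * grad
    # squared parameter distance, normalized by the expert's own displacement
    return ((student - expert_target) ** 2).sum() / ((expert_start - expert_target) ** 2).sum()

if __name__ == "__main__":
    # Usage sketch: the synthetic images are the optimized variables;
    # random vectors stand in for real expert checkpoints.
    d, num_classes, ipc = 32 * 32 * 3, 10, 10
    syn_x = torch.randn(num_classes * ipc, d, requires_grad=True)
    syn_y = torch.arange(num_classes).repeat_interleave(ipc)
    expert_start = torch.randn(d * num_classes + num_classes)
    expert_target = torch.randn(d * num_classes + num_classes)
    opt = torch.optim.SGD([syn_x], lr=0.1)
    loss = trajectory_matching_loss(syn_x, syn_y, expert_start, expert_target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```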
Alternatives and similar repositories for mtt-distillation:
Users interested in mtt-distillation are comparing it to the repositories listed below.
- Dataset Condensation (ICLR '21 and ICML '21) ☆506 · Updated last year
- ☆113 · Updated last year
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML '22) ☆112 · Updated last year
- Open-source code for the paper "Dataset Distillation" ☆796 · Updated last week
- Efficient Dataset Distillation by Representative Matching ☆113 · Updated last year
- A curated list of awesome papers on dataset distillation and related applications. ☆1,604 · Updated last week
- ☆86 · Updated 2 years ago
- Code for coreset selection methods ☆230 · Updated 2 years ago
- [TPAMI] Searching prompt modules for parameter-efficient transfer learning. ☆227 · Updated last year
- Official code for Dataset Distillation using Neural Feature Regression (NeurIPS 2022) ☆47 · Updated 2 years ago
- A dataset condensation method, accepted at CVPR 2022. ☆69 · Updated last year
- ImageNet-R(endition) and DeepAugment (ICCV 2021) ☆263 · Updated 3 years ago
- [ICCV 2023] Dataset Quantization ☆258 · Updated last year
- Official PyTorch implementation of Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion (CVPR 2020) ☆498 · Updated 2 years ago
- Compare neural networks by their feature similarity ☆358 · Updated last year
- [ICLR 2024] Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching ☆102 · Updated 10 months ago
- Exploring Visual Prompts for Adapting Large-Scale Models ☆277 · Updated 2 years ago
- PyTorch implementation of the paper "Dataset Distillation via Factorization" (NeurIPS 2022) ☆65 · Updated 2 years ago
- (NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation, 50 IPC (Images Per Class) achieves the highest 60.8% on original … ☆127 · Updated 4 months ago
- [AAAI 2022] Up to 100x Faster Data-free Knowledge Distillation ☆68 · Updated 2 years ago
- Official code for the paper "On the Versatile Uses of Partial Distance Correlation in Deep Learning" (ECCV 2022) ☆173 · Updated last year
- A collection of parameter-efficient transfer learning papers focusing on computer vision and multimodal domains. ☆401 · Updated 6 months ago
- PyTorch implementation of the Vision Transformer [Dosovitskiy, A. (ICLR '21)], modified to obtain over 90% accuracy FROM SCRATCH on CIFAR-10 wit… ☆184 · Updated last year
- A curated list of papers in Test-time Adaptation, Test-time Training and Source-free Domain Adaptation ☆491 · Updated 9 months ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm ☆65 · Updated last month
- PyTorch implementation of SimCLR: supports multi-GPU training and closely reproduces results ☆203 · Updated 11 months ago
- Soft-Label Dataset Distillation and Text Dataset Distillation ☆73 · Updated 2 years ago
- EasyRobust: an easy-to-use library for state-of-the-art robust computer vision research with PyTorch. ☆332 · Updated 9 months ago
- Distilling Dataset into Generative Models ☆54 · Updated 2 years ago
- [CVPR 2024] Efficient Dataset Distillation via Minimax Diffusion ☆90 · Updated last year