zju-vipa / CMI
[IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation
☆72 · Updated 3 years ago
Alternatives and similar repositories for CMI
Users interested in CMI are comparing it to the repositories listed below.
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation · ☆71 · Updated 2 years ago
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22) · ☆113 · Updated last year
- ☆86 · Updated 2 years ago
- Efficient Dataset Distillation by Representative Matching · ☆112 · Updated last year
- PyTorch implementation of paper "Dataset Distillation via Factorization" in NeurIPS 2022 · ☆66 · Updated 2 years ago
- [NeurIPS-2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data · ☆45 · Updated 2 years ago
- The code of the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) · ☆39 · Updated 2 years ago
- Code and pretrained models for paper: Data-Free Adversarial Distillation · ☆99 · Updated 2 years ago
- A dataset condensation method, accepted at CVPR 2022 · ☆70 · Updated last year
- ICLR 2024, Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching · ☆101 · Updated last year
- (NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation, 50 IPC (Images Per Class) achieves the highest 60.8% on original … · ☆129 · Updated 8 months ago
- [NeurIPS 2022] Make Sharpness-Aware Minimization Stronger: A Sparsified Perturbation Approach -- Official Implementation · ☆45 · Updated 2 years ago
- PyTorch implementation of our Adam-NSCL algorithm from our CVPR 2021 (oral) paper "Training Networks in Null Space for Continual Learning" · ☆62 · Updated 3 years ago
- Data-Free Knowledge Distillation · ☆21 · Updated 3 years ago
- Data-Free Network Quantization With Adversarial Knowledge Distillation PyTorch · ☆30 · Updated 3 years ago
- Official Code for Dataset Distillation using Neural Feature Regression (NeurIPS 2022) · ☆47 · Updated 2 years ago
- [CVPR 2024] On the Diversity and Realism of Distilled Dataset: An Efficient Dataset Distillation Paradigm · ☆71 · Updated 4 months ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" · ☆98 · Updated 3 years ago
- Official PyTorch implementation of PS-KD · ☆88 · Updated 2 years ago
- ☆29 · Updated last year
- This is the source code for Detecting Adversarial Data by Probing Multiple Perturbations Using Expected Perturbation Score (ICML 2023) · ☆38 · Updated 9 months ago
- ☆23 · Updated last year
- Code of Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint · ☆18 · Updated last year
- This repository is the official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022 · ☆22 · Updated 3 years ago
- [CVPR23] "Understanding and Improving Visual Prompting: A Label-Mapping Perspective" by Aochuan Chen, Yuguang Yao, Pin-Yu Chen, Yihua Zha… · ☆53 · Updated last year
- [ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers · ☆22 · Updated last year
- This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation) · ☆79 · Updated 3 months ago
- ☆14 · Updated 2 years ago
- [NeurIPS'22] This is an official implementation for "Scaling & Shifting Your Features: A New Baseline for Efficient Model Tuning" · ☆183 · Updated last year
- This is the code of ICLR 2022 Oral paper 'Non-Transferable Learning: A New Approach for Model Ownership Verification and Applicability Au… · ☆30 · Updated last year