zju-vipa / MosaicKD
[NeurIPS-2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data
☆44 · Updated 2 years ago
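Per its title, MosaicKD studies knowledge distillation when the teacher's original training data is unavailable and only out-of-domain data can be used. For orientation, here is a minimal, illustrative PyTorch sketch of the generic distillation objective this line of work builds on (temperature-scaled KL divergence between teacher and student outputs). This is a standard textbook formulation, not MosaicKD's actual training loop, and all names are placeholders.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Temperature-scaled distillation loss (Hinton et al., 2015):
    KL divergence between softened teacher and student distributions."""
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)
    # The T**2 factor keeps gradient magnitudes comparable to the
    # hard-label cross-entropy term when the two losses are combined.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T ** 2)
```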
Alternatives and similar repositories for MosaicKD:
Users interested in MosaicKD are comparing it to the repositories listed below
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation · ☆69 · Updated 2 years ago
- Official PyTorch implementation of PS-KD · ☆83 · Updated 2 years ago
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation · ☆67 · Updated 2 years ago
- [NeurIPS 2022] Make Sharpness-Aware Minimization Stronger: A Sparsified Perturbation Approach -- Official Implementation · ☆44 · Updated last year
- PyTorch implementation of the NeurIPS 2022 paper "Dataset Distillation via Factorization" · ☆63 · Updated 2 years ago
- Code and pretrained models for the paper "Data-Free Adversarial Distillation" · ☆95 · Updated 2 years ago
- Code for the paper "Minimizing the Accumulated Trajectory Error to Improve Dataset Distillation" (CVPR 2023) · ☆40 · Updated last year
- Data-Free Knowledge Distillation · ☆20 · Updated 2 years ago
- Efficient Dataset Distillation by Representative Matching · ☆111 · Updated 11 months ago
- [NeurIPS'22] What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective · ☆36 · Updated 2 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… · ☆81 · Updated 3 years ago
- [ICLR 2023 Spotlight] Divide to Adapt: Mitigating Confirmation Bias for Domain Adaptation of Black-Box Predictors · ☆40 · Updated last year
- PyTorch implementation of Data-Free Network Quantization with Adversarial Knowledge Distillation · ☆29 · Updated 3 years ago
- Official repo for the WACV 2023 paper: Federated Domain Generalization for Image Recognition via Cross-Client Style Transfer · ☆27 · Updated last year
- A dataset condensation method, accepted at CVPR 2022 · ☆68 · Updated last year
- [NeurIPS 2021] "When does Contrastive Learning Preserve Adversarial Robustness from Pretraining to Finetuning?" · ☆48 · Updated 3 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" · ☆92 · Updated 2 years ago
- [NeurIPS 2021] "Class-Disentanglement and Applications in Adversarial Detection and Defense" · ☆44 · Updated 3 years ago
- [ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers · ☆23 · Updated 7 months ago
- Official implementation of Dataset Condensation with Contrastive Signals (DCC), accepted at ICML 2022 · ☆20 · Updated 2 years ago
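Several of the repositories above implement data-free knowledge distillation, where synthetic inputs replace the teacher's unavailable training data. The sketch below illustrates one common adversarial formulation, in the spirit of Data-Free Adversarial Distillation: a generator seeks inputs where teacher and student disagree, and the student is trained to close that gap. All module, optimizer, and parameter names are hypothetical placeholders, and the details differ across the papers listed.

```python
import torch
import torch.nn.functional as F

def data_free_kd_step(generator, teacher, student, opt_g, opt_s,
                      batch_size=64, z_dim=100, device="cpu"):
    """One adversarial data-free KD step (hypothetical names throughout).

    The generator looks for inputs where student and teacher disagree;
    the student then learns to match the teacher on those inputs.
    The teacher stays frozen: no optimizer holds its parameters.
    """
    # 1) Generator step: maximize teacher-student discrepancy.
    z = torch.randn(batch_size, z_dim, device=device)
    fake = generator(z)
    loss_g = -F.l1_loss(student(fake), teacher(fake))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

    # 2) Student step: minimize the same discrepancy on fresh samples.
    z = torch.randn(batch_size, z_dim, device=device)
    fake = generator(z).detach()  # do not update the generator here
    with torch.no_grad():
        t_logits = teacher(fake)
    loss_s = F.l1_loss(student(fake), t_logits)
    opt_s.zero_grad()
    loss_s.backward()
    opt_s.step()
    return loss_g.item(), loss_s.item()
```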