bolianchen / Data-Free-Learning-of-Student-Networks
☆22 · Updated 4 years ago
Alternatives and similar repositories for Data-Free-Learning-of-Student-Networks
Users interested in Data-Free-Learning-of-Student-Networks are comparing it to the repositories listed below.
- Code and pretrained models for the paper "Data-Free Adversarial Distillation" ☆99 · Updated 2 years ago
- Data-Free Network Quantization With Adversarial Knowledge Distillation (PyTorch) ☆29 · Updated 3 years ago
- [NeurIPS-2021] Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data ☆44 · Updated 2 years ago
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation ☆72 · Updated 3 years ago
- Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22) ☆113 · Updated last year
- [AAAI-2021, TKDE-2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration" ☆75 · Updated 10 months ago
- ☆27 · Updated 4 years ago
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation ☆70 · Updated 2 years ago
- [AAAI-2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆74 · Updated last year
- Implementation of "Effective Sparsification of Neural Networks with Global Sparsity Constraint" ☆31 · Updated 3 years ago
- [ICLR 2021] "Robust Overfitting may be mitigated by properly learned smoothening" by Tianlong Chen*, Zhenyu Zhang*, Sijia Liu, Shiyu Chan… ☆47 · Updated 3 years ago
- Code and checkpoints of compressed networks for the paper "HYDRA: Pruning Adversarially Robust Neural Networks" (NeurIPS 2020) (ht… ☆92 · Updated 2 years ago
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated 2 years ago
- ZSKD with PyTorch ☆31 · Updated last year
- ☆30 · Updated 4 years ago
- ☆107 · Updated 3 years ago
- [NeurIPS 2021] "Class-Disentanglement and Applications in Adversarial Detection and Defense" ☆45 · Updated 3 years ago
- Repository for "Out-of-Distribution Generalization via Decomposed Feature Representation and Semantic Augmentation" ☆18 · Updated 3 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆81 · Updated 3 years ago
- A simple PyTorch reimplementation of "Online Knowledge Distillation via Collaborative Learning" ☆48 · Updated 2 years ago
- Official PyTorch implementation of "Flexible Dataset Distillation: Learn Labels Instead of Images" ☆42 · Updated 4 years ago
- Code for "Improving Robustness of Vision Transformers by Reducing Sensitivity to Patch Corruptions" ☆11 · Updated last year
- ☆30 · Updated 3 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆71 · Updated 5 years ago
- (IJCAI 2019) Knowledge Amalgamation from Heterogeneous Networks by Common Feature Learning ☆10 · Updated 2 years ago
- [CVPR 2021] "The Lottery Tickets Hypothesis for Supervised and Self-supervised Pre-training in Computer Vision Models", Tianlong Chen, Jon… ☆69 · Updated 2 years ago
- [NeurIPS 2021] "When does Contrastive Learning Preserve Adversarial Robustness from Pretraining to Finetuning?" ☆48 · Updated 3 years ago
- Data-Free Knowledge Distillation ☆21 · Updated 3 years ago
- Code for "Active Mixup" (CVPR 2020) ☆23 · Updated 3 years ago
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100 ☆166 · Updated 4 years ago