bhheo / BSS_distillation
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
☆71 · Updated 5 years ago
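For context, the BSS paper builds on the standard knowledge-distillation objective of Hinton et al.: a temperature-softened KL divergence against the teacher's outputs plus a cross-entropy term on the ground-truth labels. Below is a minimal NumPy sketch of that baseline loss only (the BSS-specific adversarial boundary-supporting samples are not shown); the function names and the `T`/`alpha` hyperparameters are illustrative, not taken from this repository.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL(teacher || student) at temperature T,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1) * T * T
    # Hard-target term: ordinary cross-entropy with the true labels.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    return np.mean(alpha * soft + (1 - alpha) * hard)
```

When student and teacher logits coincide, the soft term vanishes, so with `alpha=1.0` the loss is zero; mismatched logits yield a positive KL term.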
Alternatives and similar repositories for BSS_distillation
Users interested in BSS_distillation are comparing it to the repositories listed below.
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆104 · Updated 5 years ago
- ☆51 · Updated 5 years ago
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆24 · Updated 5 years ago
- Source code accompanying our CVPR 2019 paper: "NetTailor: Tuning the architecture, not just the weights." ☆53 · Updated 3 years ago
- ☆61 · Updated 5 years ago
- Unofficial PyTorch implementation of Born-Again Neural Networks ☆53 · Updated 4 years ago
- [ICCV'19] Improving Adversarial Robustness via Guided Complement Entropy ☆40 · Updated 5 years ago
- ☆25 · Updated 5 years ago
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆105 · Updated 4 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆74 · Updated last year
- ☆17 · Updated 6 years ago
- PyTorch implementation of GAL ☆56 · Updated 5 years ago
- ☆25 · Updated 5 years ago
- DELTA: DEep Learning Transfer using Feature Map with Attention for Convolutional Networks https://arxiv.org/abs/1901.09229 ☆66 · Updated 4 years ago
- [CVPR 2020] Adversarial Robustness: From Self-Supervised Pre-Training to Fine-Tuning ☆85 · Updated 3 years ago
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) ☆177 · Updated 5 years ago
- Zero-Shot Knowledge Distillation in Deep Networks ☆67 · Updated 3 years ago
- Accompanying code for the paper "Zero-shot Knowledge Transfer via Adversarial Belief Matching" ☆141 · Updated 5 years ago
- [ICCV 2019 oral] Code for Semi-Supervised Learning by Augmented Distribution Alignment ☆62 · Updated 3 years ago
- Project page for our paper: Interpreting Adversarially Trained Convolutional Neural Networks ☆66 · Updated 5 years ago
- Deep Metric Transfer for Label Propagation with Limited Annotated Data ☆49 · Updated 2 years ago
- Further improve robustness of mixup-trained models in inference (ICLR 2020) ☆60 · Updated 4 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆81 · Updated 3 years ago
- Learning Metrics from Teachers: Compact Networks for Image Embedding (CVPR 2019) ☆76 · Updated 6 years ago
- [ICML'19] How does Disagreement Help Generalization against Label Corruption? ☆84 · Updated 5 years ago
- Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019) ☆49 · Updated 5 years ago
- Implementation of NAT ☆58 · Updated 5 years ago
- Cheap distillation for convolutional neural networks ☆33 · Updated 6 years ago
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated 2 years ago
- Code and pretrained models for paper: Data-Free Adversarial Distillation ☆99 · Updated 2 years ago