dwang181 / active-mixup
Code for Active Mixup (CVPR 2020)
☆23 · Updated 3 years ago
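For readers unfamiliar with the technique: mixup-style augmentation blends pairs of inputs by a convex combination with a Beta-sampled coefficient, and Active Mixup builds on this idea to synthesize candidate images for data-efficient knowledge distillation. A minimal, repo-independent sketch of the mixing step (the helper name and defaults are illustrative, not taken from the repository):

```python
import numpy as np

def mixup_pair(x1, x2, alpha=1.0, rng=None):
    """Blend two inputs with a coefficient lam ~ Beta(alpha, alpha).

    Illustrative helper, not part of the active-mixup codebase.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    lam = float(rng.beta(alpha, alpha))
    # Convex combination: result lies on the line segment between x1 and x2.
    return lam * x1 + (1.0 - lam) * x2, lam

x_a = np.zeros((4, 4))   # stand-in for one image
x_b = np.ones((4, 4))    # stand-in for another image
mixed, lam = mixup_pair(x_a, x_b)
# every entry of `mixed` equals (1 - lam), i.e. strictly between the two inputs
```

In the paper's setting, many such blended images are generated and the most informative ones (where a student disagrees with the teacher) are queried, rather than labeling raw data.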
Alternatives and similar repositories for active-mixup
Users interested in active-mixup are comparing it to the repositories listed below.
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated 2 years ago
- Code for the CVPR 2021 paper "MOOD: Multi-level Out-of-distribution Detection" ☆38 · Updated last year
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆81 · Updated 3 years ago
- ☆57 · Updated 3 years ago
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated 2 years ago
- Code for "Transfer Learning without Knowing: Reprogramming Black-box Machine Learning Models with Scarce Data and Limited Resources" (IC… ☆38 · Updated 4 years ago
- [CVPR 2021] "The Lottery Tickets Hypothesis for Supervised and Self-supervised Pre-training in Computer Vision Models", Tianlong Chen, Jon… ☆69 · Updated 2 years ago
- ☆22 · Updated 5 years ago
- Code repository for the paper "OODformer: Out-Of-Distribution Detection Transformer" ☆40 · Updated 3 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆29 · Updated 4 years ago
- ☆27 · Updated 2 years ago
- Unofficial PyTorch implementation of Adversarial AutoAugment (ICLR 2020) ☆21 · Updated 4 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆71 · Updated 5 years ago
- [ICLR 2022] Fast AdvProp ☆35 · Updated 3 years ago
- Paper and code for "Curriculum Learning by Optimizing Learning Dynamics" (AISTATS 2021) ☆19 · Updated 3 years ago
- "Maximum-Entropy Adversarial Data Augmentation for Improved Generalization and Robustness" (NeurIPS 2020) ☆50 · Updated 4 years ago
- [NeurIPS 2020] "Once-for-All Adversarial Training: In-Situ Tradeoff between Robustness and Accuracy for Free", Haotao Wang*, Tianlong C… ☆44 · Updated 3 years ago
- Code release for the NeurIPS 2020 paper "Co-Tuning for Transfer Learning" ☆40 · Updated 3 years ago
- [NeurIPS 2021] "Improving Contrastive Learning on Imbalanced Data via Open-World Sampling", Ziyu Jiang, Tianlong Chen, Ting Chen, Zhangya… ☆28 · Updated 3 years ago
- ☆19 · Updated 3 years ago
- ☆28 · Updated 3 years ago
- Adjust Decision Boundary for Class Imbalanced Learning ☆19 · Updated 5 years ago
- Smooth Adversarial Training ☆67 · Updated 4 years ago
- [IJCAI 2021] "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" ☆41 · Updated 2 years ago
- [CVPR 2020] Adversarial Robustness: From Self-Supervised Pre-Training to Fine-Tuning ☆85 · Updated 3 years ago
- Robust Contrastive Learning Using Negative Samples with Diminished Semantics (NeurIPS 2021) ☆39 · Updated 3 years ago
- Official PyTorch implementation of the ICCV 2019 paper "Fooling Network Interpretation in Image Classification" ☆24 · Updated 5 years ago
- PyTorch implementation of Adversarially Robust Distillation (ARD) ☆59 · Updated 6 years ago
- SKD: Self-supervised Knowledge Distillation for Few-shot Learning ☆98 · Updated last year
- Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression ☆18 · Updated 3 years ago