ZhiluZhang123 / neurips_2020_distillation
Code for "Self-Distillation as Instance-Specific Label Smoothing"
☆16 · Updated 5 years ago
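The paper's core idea is that self-distillation acts as an instance-specific form of label smoothing: instead of spreading the smoothing mass uniformly over classes, it is spread according to a (softened) teacher prediction, so every example gets its own soft target. A minimal sketch of the two target constructions, assuming PyTorch; function names, the smoothing weight `eps`, and the temperature `T` are illustrative, not taken from the repository:

```python
import torch
import torch.nn.functional as F

def label_smoothing_targets(labels, num_classes, eps=0.1):
    # Uniform label smoothing: every instance of a class gets the
    # same soft target (1 - eps on the true class, eps spread uniformly).
    one_hot = F.one_hot(labels, num_classes).float()
    return one_hot * (1.0 - eps) + eps / num_classes

def self_distillation_targets(labels, num_classes, teacher_logits, eps=0.1, T=4.0):
    # Instance-specific smoothing: the eps mass follows the teacher's
    # temperature-softened prediction for *this* example, so two images
    # of the same class can receive different soft targets.
    one_hot = F.one_hot(labels, num_classes).float()
    teacher_probs = F.softmax(teacher_logits / T, dim=-1)
    return one_hot * (1.0 - eps) + eps * teacher_probs

labels = torch.tensor([0, 2])
teacher_logits = torch.randn(2, 3)
uniform = label_smoothing_targets(labels, 3)
per_instance = self_distillation_targets(labels, 3, teacher_logits)
```

Both constructions yield valid probability distributions (rows sum to 1); the difference is only in where the `eps` mass goes.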
Alternatives and similar repositories for neurips_2020_distillation
Users interested in neurips_2020_distillation are comparing it to the repositories listed below.
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆82 · Updated 4 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆76 · Updated 2 years ago
- Official PyTorch implementation of "Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity" (ICLR 2021 Oral) ☆105 · Updated 4 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated 5 years ago
- Adjust Decision Boundary for Class Imbalanced Learning ☆19 · Updated 5 years ago
- AdaShare: Learning What To Share For Efficient Deep Multi-Task Learning ☆114 · Updated 4 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆71 · Updated 6 years ago
- PyTorch implementation of "When Does Label Smoothing Help?" ☆126 · Updated 5 years ago
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆107 · Updated 5 years ago
- Code for the paper "M2m: Imbalanced Classification via Major-to-minor Translation" (CVPR 2020) ☆95 · Updated 4 years ago
- Official implementation of "Auxiliary Learning by Implicit Differentiation" (ICLR 2021) ☆86 · Updated last year
- ☆34 · Updated 2 years ago
- PyTorch implementation of the paper "SuperLoss: A Generic Loss for Robust Curriculum Learning" (NeurIPS 2020) ☆29 · Updated 4 years ago
- [CVPR 2021] Code for "Augmentation Strategies for Learning with Noisy Labels" ☆113 · Updated 4 years ago
- ☆61 · Updated 5 years ago
- SpotTune: Transfer Learning through Adaptive Fine-tuning ☆91 · Updated 6 years ago
- ☆57 · Updated 4 years ago
- Reproduced results for "Symmetric Cross Entropy for Robust Learning with Noisy Labels" (ICCV 2019), https://arxiv.org/abs/1908.06112 ☆190 · Updated 5 years ago
- SKD: Self-supervised Knowledge Distillation for Few-shot Learning ☆102 · Updated 2 years ago
- ☆95 · Updated 5 years ago
- Accompanying code for the paper "Zero-shot Knowledge Transfer via Adversarial Belief Matching" ☆144 · Updated 5 years ago
- PyTorch implementation of the paper "Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels" (NeurIPS 2018) ☆130 · Updated 6 years ago
- Un-Mix: Rethinking Image Mixtures for Unsupervised Visual Representation Learning ☆151 · Updated 3 years ago
- PyTorch implementation of "Contrast to Divide: Self-supervised Pre-training for Learning with Noisy Labels" ☆71 · Updated 4 years ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated 3 years ago
- [IJCAI 2021] "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" ☆42 · Updated 2 years ago
- Code release for the NeurIPS 2020 paper "Co-Tuning for Transfer Learning" ☆40 · Updated 3 years ago
- PyTorch implementation of "Self-supervised Contrastive Regularization for DG (SelfReg)" (ICCV 2021) ☆78 · Updated 3 years ago
- [NeurIPS 2020] "Robust Pre-Training by Adversarial Contrastive Learning", Ziyu Jiang, Tianlong Chen, Ting Chen, Zhangyang Wang ☆116 · Updated 4 years ago
- PyTorch implementation of consistency regularization methods for semi-supervised learning ☆79 · Updated 5 years ago
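Most of the distillation repositories above build on the same basic objective: hard-label cross-entropy blended with a temperature-scaled KL divergence between student and teacher predictions. A minimal PyTorch sketch for orientation; the hyperparameter names `T` and `alpha` are illustrative, not taken from any specific repository:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Hinton-style knowledge distillation: cross-entropy on the hard
    # labels plus KL divergence to the teacher's softened distribution.
    # The T*T factor keeps soft-target gradient magnitudes roughly
    # comparable across temperatures.
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return (1.0 - alpha) * hard + alpha * soft

student_logits = torch.randn(4, 10, requires_grad=True)
teacher_logits = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

When the student's logits equal the teacher's, the KL term vanishes, which is a quick sanity check for any implementation of this loss.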