ZhiluZhang123 / neurips_2020_distillation
Code for "Self-Distillation as Instance-Specific Label Smoothing"
☆16 · Updated 4 years ago
Alternatives and similar repositories for neurips_2020_distillation
Users interested in neurips_2020_distillation are comparing it to the libraries listed below.
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆81 · Updated 3 years ago
- ☆28 · Updated 3 years ago
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆25 · Updated 5 years ago
- Adjust Decision Boundary for Class Imbalanced Learning ☆19 · Updated 5 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation ☆28 · Updated 3 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆108 · Updated 5 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers" ☆74 · Updated last year
- Official implementation of "Multi-Objective Interpolation Training for Robustness to Label Noise" ☆39 · Updated 3 years ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated 4 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆49 · Updated 2 years ago
- PyTorch implementation of the paper "SuperLoss: A Generic Loss for Robust Curriculum Learning" (NeurIPS 2020) ☆29 · Updated 4 years ago
- Code for Feature Fusion for Online Mutual Knowledge Distillation ☆26 · Updated 4 years ago
- ☆36 · Updated 3 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆29 · Updated 4 years ago
- On the Importance of Gradients for Detecting Distributional Shifts in the Wild ☆56 · Updated 2 years ago
- PyTorch implementation of "Contrast to Divide: self-supervised pre-training for learning with noisy labels" ☆71 · Updated 4 years ago
- Code for "Balanced Knowledge Distillation for Long-tailed Learning" ☆27 · Updated last year
- Sinkhorn Label Allocation is a label assignment method for semi-supervised self-training algorithms. The SLA algorithm is described in fu… ☆53 · Updated 4 years ago
- Official PyTorch implementation of "Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity" (ICLR 2021 Oral) ☆103 · Updated 3 years ago
- Graph Knowledge Distillation ☆13 · Updated 5 years ago
- ☆27 · Updated 4 years ago
- [ICASSP 2020] Code release for the paper "Heterogeneous Domain Generalization via Domain Mixup" ☆26 · Updated 4 years ago
- Official code for "Mean Shift for Self-Supervised Learning" ☆57 · Updated 3 years ago
- Code release for the NeurIPS 2020 paper "Co-Tuning for Transfer Learning" ☆40 · Updated 3 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆40 · Updated 2 years ago
- ☆57 · Updated 4 years ago
- This is a public repository for: ☆38 · Updated 3 years ago
- ☆46 · Updated 3 years ago
- PyTorch implementation of "Temporal Output Discrepancy for Active Learning" (ICCV 2021) ☆41 · Updated 2 years ago
- ☆61 · Updated 5 years ago