ArchipLab-LinfengZhang / pytorch-self-distillation-final
☆126 Updated 4 years ago
Alternatives and similar repositories for pytorch-self-distillation-final
Users interested in pytorch-self-distillation-final are comparing it to the libraries listed below; a minimal sketch of the distillation loss these projects build on follows the list.
- A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', … ☆178 Updated 3 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆108 Updated 5 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆97 Updated 3 years ago
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆117 Updated 4 years ago
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆237 Updated 2 years ago
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR-2021) ☆100 Updated last year
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && Pytorch Implementations of… ☆108 Updated 2 years ago
- Official PyTorch implementation of PS-KD ☆88 Updated 2 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022 ☆146 Updated 2 years ago
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100. ☆166 Updated 4 years ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆74 Updated last year
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆40 Updated 2 years ago
- [NeurIPS 2020] Balanced Meta-Softmax for Long-Tailed Visual Recognition ☆140 Updated 3 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆75 Updated 10 months ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆272 Updated 2 years ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 Updated 4 years ago
- Improving Calibration for Long-Tailed Recognition (CVPR 2021) ☆148 Updated 3 years ago
- When Does Label Smoothing Help?_pytorch_implementation ☆124 Updated 5 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆49 Updated 2 years ago
- [CVPR 2021] Adaptive Consistency Regularization for Semi-Supervised Transfer Learning ☆104 Updated 3 years ago
- Feature Fusion for Online Mutual Knowledge Distillation Code ☆26 Updated 4 years ago
- Un-Mix: Rethinking Image Mixtures for Unsupervised Visual Representation Learning. ☆151 Updated 2 years ago
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" ☆67 Updated 3 years ago
- Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆584 Updated 2 years ago
- [ICCV 2021] Influence-balanced Loss for Imbalanced Visual Classification ☆100 Updated 3 years ago
- [ICLR 2021 Spotlight] Code release for "Long-tailed Recognition by Routing Diverse Distribution-Aware Experts." ☆272 Updated 2 years ago
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆176 Updated 6 months ago
- Reproducing VID from CVPR 2019 (work in progress) ☆20 Updated 5 years ago
- The official code for the paper "Delving Deep into Label Smoothing", IEEE TIP 2021 ☆81 Updated 2 years ago
- ☆34 Updated last year
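
Most of the repositories above build on the same soft-target distillation objective: a hard-label cross-entropy term combined with a temperature-softened KL divergence against teacher logits. As a rough orientation for comparing them, here is a minimal PyTorch sketch of that loss; the function name, the `temperature` and `alpha` defaults, and the self-distillation usage note are illustrative assumptions, not code taken from any listed project.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of hard-label cross-entropy and softened KL divergence.

    `temperature` and `alpha` are illustrative defaults, not values taken
    from any repository listed above.
    """
    # Hard-label loss against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label loss: KL divergence between temperature-softened distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * ce + (1.0 - alpha) * kd

# In a self-distillation setting ("Be Your Own Teacher"-style), teacher_logits
# would come from the network's own deepest classifier (detached), and
# student_logits from a shallower exit of the same network.
```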