chxy95 / Deep-Mutual-Learning
An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100.
☆167 · Updated 4 years ago
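For context, the core of Deep Mutual Learning is a peer-teaching objective: each network minimizes its own cross-entropy loss plus the KL divergence from each peer's predicted distribution. Below is a minimal two-peer sketch of that loss; the names (`dml_losses`, `logits_a`, `logits_b`) are illustrative and not taken from this repository.

```python
import torch
import torch.nn.functional as F

def dml_losses(logits_a, logits_b, targets):
    """Deep Mutual Learning objective for two peers: each network minimizes
    cross-entropy on the labels plus the KL divergence from the other
    peer's predicted distribution."""
    ce_a = F.cross_entropy(logits_a, targets)
    ce_b = F.cross_entropy(logits_b, targets)
    # KL(peer || self): the peer's distribution is treated as a fixed
    # target for this update step, hence the detach().
    kl_a = F.kl_div(F.log_softmax(logits_a, dim=1),
                    F.softmax(logits_b, dim=1).detach(),
                    reduction="batchmean")
    kl_b = F.kl_div(F.log_softmax(logits_b, dim=1),
                    F.softmax(logits_a, dim=1).detach(),
                    reduction="batchmean")
    return ce_a + kl_a, ce_b + kl_b
```

In practice the two networks are updated alternately (or jointly with separate optimizers), each using its own returned loss term.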
Alternatives and similar repositories for Deep-Mutual-Learning:
Users interested in Deep-Mutual-Learning are comparing it to the libraries listed below.
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆236 · Updated 2 years ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆74 · Updated last year
- A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', … ☆175 · Updated 3 years ago
- ☆124 · Updated 4 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆75 · Updated 8 months ago
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 ☆396 · Updated 3 years ago
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" (the classic soft-target loss; see the sketch after this list) ☆67 · Updated 3 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆107 · Updated 4 years ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated 3 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆48 · Updated 2 years ago
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆117 · Updated 4 years ago
- ☆61 · Updated 4 years ago
- Code and pretrained models for the paper: Data-Free Adversarial Distillation ☆98 · Updated 2 years ago
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR-2021) ☆98 · Updated 11 months ago
- [NeurIPS 2020] Balanced Meta-Softmax for Long-Tailed Visual Recognition ☆140 · Updated 3 years ago
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆587 · Updated 2 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆92 · Updated 2 years ago
- Code for Paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆40 · Updated 2 years ago
- Knowledge Amalgamation Engine ☆98 · Updated last year
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆269 · Updated 2 years ago
- PyTorch implementation for Channel Distillation ☆100 · Updated 4 years ago
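Several of the listed repositories (for example the "Distilling the Knowledge in a Neural Network" implementation above) build on the classic soft-target distillation loss: a temperature-scaled KL term between teacher and student blended with hard-label cross-entropy. A minimal sketch follows; the temperature `T=4.0` and weight `alpha=0.9` are illustrative defaults, not values from any of the repositories.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Hinton-style distillation: temperature-softened KL between teacher
    and student (scaled by T^2) blended with hard-label cross-entropy."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```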