[AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers".
☆76 · Updated Jul 6, 2023
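OKDDip trains a group of peer students together: each peer fits the labels while also distilling from a weighted ensemble of the other peers' softened predictions, which keeps the peers diverse. The snippet below is a minimal sketch of such a peer-distillation objective in PyTorch; the function name is illustrative, and the uniform averaging over peers is a simplification of the paper's learned per-peer attention weights.

```python
import torch
import torch.nn.functional as F

def peer_distillation_loss(peer_logits, labels, T=3.0):
    """Simplified online KD among peers (illustrative, not OKDDip's exact loss).

    peer_logits: list of [batch, classes] tensors, one per peer model.
    Each peer matches the averaged soft targets of the *other* peers;
    OKDDip instead learns attention weights over peers.
    """
    # Supervised term: every peer is trained on the ground-truth labels.
    ce = sum(F.cross_entropy(z, labels) for z in peer_logits)

    kd = 0.0
    for i, z in enumerate(peer_logits):
        # Soft targets from the other peers, detached so gradients
        # do not flow back through the "teacher" side.
        others = [peer_logits[j].detach()
                  for j in range(len(peer_logits)) if j != i]
        target = torch.stack(
            [F.softmax(p / T, dim=1) for p in others]).mean(dim=0)
        # Standard temperature-scaled KL distillation term.
        kd = kd + F.kl_div(F.log_softmax(z / T, dim=1), target,
                           reduction="batchmean") * (T * T)
    return ce + kd
```

With three peer networks, for example, `peer_distillation_loss([net1(x), net2(x), net3(x)], y)` would give the combined objective for one batch.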
Alternatives and similar repositories for OKDDip
Users interested in OKDDip are comparing it to the repositories listed below
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆50 · Updated Dec 13, 2022
- ☆61 · Updated Apr 24, 2020
- PyTorch reproduction of Peer Collaborative Learning for Online Knowledge Distillation (AAAI 2021) ☆21 · Updated May 28, 2022
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022: Distilling a Powerful Student Model via Online Knowledge Distillation ☆31 · Updated Nov 11, 2021
- Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) ☆103 · Updated Apr 30, 2024
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆78 · Updated Jul 29, 2024
- ☆27 · Updated Feb 6, 2021
- [TPAMI-2023] Official implementations of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆26 · Updated Jul 14, 2023
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated Nov 22, 2022
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆180 · Updated Jan 29, 2022
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆238 · Updated Dec 15, 2022
- MarginDistillation: distillation for margin-based softmax ☆44 · Updated Sep 28, 2020
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated Jun 18, 2020
- Graph Knowledge Distillation ☆13 · Updated Mar 6, 2020
- ☆17 · Updated Nov 7, 2024
- A PyTorch implementation of scalable neural networks. ☆23 · Updated Jun 9, 2020
- ☆23 · Updated Oct 27, 2019
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆25 · Updated May 25, 2020
- ☆19 · Updated Jun 26, 2021
- Switchable Online Knowledge Distillation ☆19 · Updated Oct 27, 2024
- Code for the CVPR 2020 paper "Distilling Cross-Task Knowledge via Relationship Matching". ☆49 · Updated Feb 7, 2021
- PyTorch implementation for Channel Distillation ☆103 · Updated Jun 9, 2020
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) ☆177 · Updated Feb 20, 2020
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021). ☆2,654 · Updated May 30, 2023
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆107 · Updated Nov 28, 2020
- FedMD: Heterogenous Federated Learning via Model Distillation ☆18 · Updated Nov 1, 2019
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated Jun 9, 2021
- ☆10 · Updated Jul 5, 2019
- TensorFlow Implementation of Deep Mutual Learning ☆326 · Updated Apr 10, 2018
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 oral) ☆585 · Updated Feb 15, 2023
- [CVPR 2018] Official TensorFlow code of Learning Strict Identity Mappings in Deep Residual Networks ☆21 · Updated Sep 27, 2020
- Code for the ICCV 2021 paper "Distilling Holistic Knowledge with Graph Neural Networks" ☆44 · Updated Dec 14, 2021
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆414 · Updated May 17, 2021
- Q. Yao, H. Yang, B. Han, G. Niu, J. Kwok. Searching to Exploit Memorization Effect in Learning from Noisy Labels. ICML 2020 ☆23 · Updated Aug 23, 2020
- Spell and pronounce words with a neural network ☆10 · Updated Feb 13, 2017
- ☆10 · Updated May 9, 2019
- Accelerating Transfer Learning with Robust Neural Nets ☆11 · Updated Oct 2, 2020
- Source code of the ACCV 2022 paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Updated Jul 5, 2023
- ☆12 · Updated Oct 5, 2022