luanyunteng / pytorch-be-your-own-teacher
A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', https://arxiv.org/abs/1905.08094
☆180 · Updated Jan 29, 2022
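For context, the paper's core idea is self-distillation inside a single network: auxiliary classifiers attached after intermediate stages are trained against the ground-truth labels, against the softened output of the deepest classifier, and against a feature-level hint from the deepest stage. Below is a minimal sketch of that combined loss; the function and hyperparameter names (`byot_loss`, `T`, `alpha`, `beta`) are illustrative rather than this repository's actual API, and it assumes intermediate features have already been projected to the deepest feature's shape.

```python
# Minimal sketch of the self-distillation loss from "Be Your Own Teacher"
# (arXiv:1905.08094). Illustrative only -- names and defaults are not
# taken from this repository.
import torch
import torch.nn.functional as F

def byot_loss(logits_list, features_list, labels, T=3.0, alpha=0.3, beta=1e-6):
    """logits_list / features_list are ordered shallow-to-deep; the last
    entry is the deepest classifier, which serves as the teacher.
    Assumes each intermediate feature has already been projected (e.g. by
    a bottleneck adapter) to match the deepest feature's shape."""
    teacher_logits = logits_list[-1].detach()   # no gradients into the teacher
    teacher_feat = features_list[-1].detach()
    loss = F.cross_entropy(logits_list[-1], labels)   # deepest head: plain CE
    for logits, feat in zip(logits_list[:-1], features_list[:-1]):
        ce = F.cross_entropy(logits, labels)          # supervision from labels
        kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                      F.softmax(teacher_logits / T, dim=1),
                      reduction="batchmean") * T * T  # soften, rescale by T^2
        hint = F.mse_loss(feat, teacher_feat)         # feature-hint (L2) loss
        loss = loss + (1 - alpha) * ce + alpha * kd + beta * hint
    return loss

if __name__ == "__main__":
    # Toy shapes: three heads, batch of 8, 100 classes, 512-d pooled features.
    logits = [torch.randn(8, 100) for _ in range(3)]
    feats = [torch.randn(8, 512) for _ in range(3)]
    labels = torch.randint(0, 100, (8,))
    print(byot_loss(logits, feats, labels))
```

Many of the repositories listed below vary one of these three terms: the teacher signal (a peer, a previous mini-batch, an ensemble) or the regularizer attached to the student's own predictions.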
Alternatives and similar repositories for pytorch-be-your-own-teacher
Users interested in pytorch-be-your-own-teacher are comparing it to the libraries listed below.
- ☆128 · Updated Nov 2, 2020
- ☆10 · Updated Dec 15, 2018
- A PyTorch implementation of scalable neural networks. ☆23 · Updated Jun 9, 2020
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆76 · Updated Jul 6, 2023
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated Jun 18, 2020
- ☆23 · Updated Oct 27, 2019
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR 2021) ☆103 · Updated Apr 30, 2024
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆238 · Updated Dec 15, 2022
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆585 · Updated Feb 15, 2023
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆16 · Updated Oct 22, 2020
- Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression ☆19 · Updated Oct 12, 2021
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated Jun 9, 2021
- [PR 2024] Official PyTorch Code for "Dual Teachers for Self-Knowledge Distillation" ☆13 · Updated Nov 28, 2024
- ☆15 · Updated Dec 19, 2023
- Awesome Knowledge-Distillation. A categorized collection of knowledge-distillation papers (2014-2021). ☆2,654 · Updated May 30, 2023
- This is an official implementation of our CVPR 2020 paper "Non-Local Neural Networks With Grouped Bilinear Attentional Transforms". ☆12 · Updated Jan 30, 2021
- Pytorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation ☆31 · Updated Nov 11, 2021
- Code for Paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆43 · Updated Sep 27, 2022
- ☆27 · Updated Feb 6, 2021
- Generate custom text files for dataloader within UDA methods ☆14 · Updated May 24, 2023
- Source Code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Updated Dec 21, 2022
- This is the official repo for the CVPR 2021 L2ID paper "Distill on the Go: Online knowledge distillation in self-supervised learning" ☆12 · Updated Nov 15, 2021
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,425 · Updated Oct 16, 2023
- Graph Knowledge Distillation ☆13 · Updated Mar 6, 2020
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && Pytorch Implementations of… ☆110 · Updated Nov 28, 2022
- Official PyTorch implementation of "Learning with Memory-based Virtual Classes for Deep Metric Learning" (ICCV 2021) ☆16 · Updated Oct 13, 2021
- Official pytorch Implementation of Relational Knowledge Distillation, CVPR 2019 ☆414 · Updated May 17, 2021
- [AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation" ☆182 · Updated Dec 3, 2024
- ☆61 · Updated Apr 24, 2020
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆25 · Updated May 25, 2020
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020) ☆30 · Updated Aug 19, 2020
- ☆19 · Updated Jun 26, 2021
- ☆16 · Updated Jun 18, 2025
- Pytorch implementation of various Knowledge Distillation (KD) methods. ☆1,742 · Updated Nov 25, 2021
- (Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019) ☆14 · Updated May 12, 2021
- Contrastive Learning of Image Representations with Cross-Video Cycle-Consistency ☆17 · Updated Dec 2, 2021
- [CVPR 2023] This repository includes the official implementation of our paper "Masked Autoencoders Enable Efficient Knowledge Distillers" ☆109 · Updated Jul 24, 2023
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated Nov 22, 2022
- Multi-Teacher Knowledge Distillation, code for my PhD dissertation. I used knowledge distillation as a decision-fusion and compressing m… ☆27 · Updated May 19, 2023