[ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition, plus PyTorch implementations of several self-knowledge distillation and data augmentation methods
☆110 · Updated Nov 28, 2022
Alternatives and similar repositories for Self-KD-Lib
Users that are interested in Self-KD-Lib are comparing it to the libraries listed below.
- [IJCAI-2021 & TNNLS-2022] Official implementation of Hierarchical Self-supervised Augmented Knowledge Distillation ☆78 · Updated Mar 22, 2024
- [AAAI-2020] Official implementation of HCGNets: Gated Convolutional Networks with Hybrid Connectivity for Image Classification ☆67 · Updated Sep 28, 2021
- Official implementation of CIRKD: Cross-Image Relational Knowledge Distillation for Semantic Segmentation, with implementations on Cityscapes ☆212 · Updated Aug 29, 2025
- [AAAI-2022 Oral] Official implementation of MCL: Mutual Contrastive Learning for Visual Representation Learning ☆71 · Updated Mar 1, 2022
- [ICASSP-2021] Official implementation of Multi-View Contrastive Learning for Online Knowledge Distillation (MCL-OKD) ☆27 · Updated Apr 7, 2021
- [CVPR-2021] Official implementation of Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation ☆103 · Updated Apr 30, 2024
- [ICME-2022] Official implementation of Localizing Semantic Patches for Accelerating Image Classification ☆16 · Updated Jul 1, 2022
- [CVPR-2024] Official implementation of CLIP-KD: An Empirical Study of CLIP Model Distillation ☆146 · Updated Aug 22, 2025
- [TPAMI-2023] Official implementation of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆27 · Updated Jul 14, 2023
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆44 · Updated Sep 27, 2022
- PyTorch code and checkpoint release for VanillaKD: https://arxiv.org/abs/2305.15781 ☆77 · Updated Nov 21, 2023
- Class Similarity Weighted Knowledge Distillation for Continual Semantic Segmentation ☆19 · Updated Jul 30, 2024
- Official code for the ECCV-2022 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆191 · Updated Apr 29, 2024
- Official implementation of Decoupled Knowledge Distillation (CVPR 2022, https://arxiv.org/abs/2203.08679) and DOT: A Distillation-Oriented Trainer (ICCV 2023) ☆898 · Updated Nov 5, 2023
- Official MegEngine implementation of the ECCV-2022 paper Efficient One Pass Self-distillation with Zipf's Label Smoothing ☆28 · Updated Oct 19, 2022
- Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression ☆19 · Updated Oct 12, 2021
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated Jun 18, 2020
- Code for the ECCV-2022 paper Learning to Train a Point Cloud Reconstruction Network without Matching ☆10 · Updated Nov 16, 2022
- Official implementation of the NeurIPS 2022 paper "Knowledge Distillation from A Stronger Teacher" ☆156 · Updated Dec 28, 2022
- Code for "Balanced Knowledge Distillation for Long-tailed Learning" ☆29 · Updated Oct 19, 2023
- Code for RFNet: Recurrent Forward Network for Dense Point Cloud Completion ☆20 · Updated Jan 17, 2022
- Union-set Multi-source Model Adaptation for Semantic Segmentation ☆12 · Updated Oct 24, 2022
- [ICLR 2020] Contrastive Representation Distillation (CRD), with a benchmark of recent knowledge distillation methods ☆2,427 · Updated Oct 16, 2023
- Joint learning of saliency detection and weakly supervised semantic segmentation ☆25 · Updated Sep 18, 2020
- ☆13 · Updated Jul 19, 2022
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 Oral) ☆584 · Updated Feb 15, 2023
- [VLDB 2024] Source code for FusionQuery: On-demand Fusion Queries over Multi-source Heterogeneous Data ☆11 · Updated Mar 11, 2025
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation" ☆182 · Updated Jan 29, 2022
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆50 · Updated Dec 13, 2022
- [AAAI 2024] Understanding the Role of the Projector in Knowledge Distillation ☆20 · Updated Feb 13, 2024
- Black-box Few-shot Knowledge Distillation ☆14 · Updated Jul 19, 2022
- Distilling Object Detectors with Feature Richness ☆43 · Updated Apr 15, 2022
- [ICLR 2023] "Revisiting Pruning at Initialization Through the Lens of the Ramanujan Graph" by Duc Hoang, Shiwei Liu, Radu Marculescu, Atlas Wang ☆14 · Updated Aug 4, 2023
- ☆17 · Updated Jul 10, 2022
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated Jun 9, 2021
- ☆23 · Updated Aug 14, 2022
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆15 · Updated Oct 22, 2020
- Deep Structured Instance Graph for Distilling Object Detectors (ICCV 2021) ☆36 · Updated Oct 17, 2021
- [AAAI-2025 Oral] Official implementation of Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition ☆41 · Updated Jan 13, 2025