[ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition, and PyTorch implementations of several Self-Knowledge Distillation and data augmentation methods
☆110 · Nov 28, 2022 · Updated 3 years ago
Alternatives and similar repositories for Self-KD-Lib
Users interested in Self-KD-Lib are comparing it to the libraries listed below.
- [IJCAI-2021 & TNNLS-2022] Official implementation of Hierarchical Self-supervised Augmented Knowledge Distillation ☆78 · Mar 22, 2024 · Updated 2 years ago
- [AAAI-2020] Official implementation of HCGNets: Gated Convolutional Networks with Hybrid Connectivity for Image Classification ☆67 · Sep 28, 2021 · Updated 4 years ago
- Official implementation of CIRKD: Cross-Image Relational Knowledge Distillation for Semantic Segmentation and implementations on Citysca… ☆212 · Aug 29, 2025 · Updated 6 months ago
- [ICASSP-2021] Official implementation of Multi-View Contrastive Learning for Online Knowledge Distillation (MCL-OKD) ☆27 · Apr 7, 2021 · Updated 4 years ago
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR-2021) ☆103 · Apr 30, 2024 · Updated last year
- [ICME-2022] Official implementation of Localizing Semantic Patches for Accelerating Image Classification ☆16 · Jul 1, 2022 · Updated 3 years ago
- [CVPR-2024] Official implementation of CLIP-KD: An Empirical Study of CLIP Model Distillation ☆144 · Aug 22, 2025 · Updated 7 months ago
- [TPAMI-2023] Official implementation of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆26 · Jul 14, 2023 · Updated 2 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆44 · Sep 27, 2022 · Updated 3 years ago
- PyTorch code and checkpoint release for VanillaKD: https://arxiv.org/abs/2305.15781 ☆77 · Nov 21, 2023 · Updated 2 years ago
- Official code for the ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆191 · Apr 29, 2024 · Updated last year
- Official implementation of [CVPR2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV2023] DOT: A Distill… ☆895 · Nov 5, 2023 · Updated 2 years ago
- Official MegEngine implementation of the ECCV2022 paper: Efficient One Pass Self-distillation with Zipf's Label Smoothin… ☆28 · Oct 19, 2022 · Updated 3 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Jun 18, 2020 · Updated 5 years ago
- Code for the ECCV'22 paper "Learning to Train a Point Cloud Reconstruction Network without Matching" ☆10 · Nov 16, 2022 · Updated 3 years ago
- Official implementation of the paper "Knowledge Distillation from A Stronger Teacher" (NeurIPS 2022) ☆156 · Dec 28, 2022 · Updated 3 years ago
- Code for "Balanced Knowledge Distillation for Long-tailed Learning" ☆29 · Oct 19, 2023 · Updated 2 years ago
- "NKD and USKD" (ICCV 2023) and "ViTKD" (CVPRW 2024) ☆244 · Oct 10, 2023 · Updated 2 years ago
- Code for RFNet: Recurrent Forward Network for Dense Point Cloud Completion ☆20 · Jan 17, 2022 · Updated 4 years ago
- Union-set Multi-source Model Adaptation for Semantic Segmentation ☆12 · Oct 24, 2022 · Updated 3 years ago
- Adapter-X: A Novel General Parameter-Efficient Fine-Tuning Framework for Vision ☆11 · Jul 22, 2024 · Updated last year
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,426 · Oct 16, 2023 · Updated 2 years ago
- PyTorch implementation of "Gated Transfer Network for Transfer Learning" ☆11 · Jun 3, 2019 · Updated 6 years ago
- Joint learning of saliency detection and weakly supervised semantic segmentation ☆25 · Sep 18, 2020 · Updated 5 years ago
- [CVPR 2020 Oral] Revisiting Knowledge Distillation via Label Smoothing Regularization ☆584 · Feb 15, 2023 · Updated 3 years ago
- ☆128 · Nov 2, 2020 · Updated 5 years ago
- [VLDB 2024] Source code for FusionQuery: On-demand Fusion Queries over Multi-source Heterogeneous Data ☆11 · Mar 11, 2025 · Updated last year
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆181 · Jan 29, 2022 · Updated 4 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆50 · Dec 13, 2022 · Updated 3 years ago
- [AAAI 2024] Understanding the Role of the Projector in Knowledge Distillation ☆20 · Feb 13, 2024 · Updated 2 years ago
- ☆17 · Oct 22, 2022 · Updated 3 years ago
- Distilling Object Detectors with Feature Richness ☆43 · Apr 15, 2022 · Updated 3 years ago
- Black-box Few-shot Knowledge Distillation ☆14 · Jul 19, 2022 · Updated 3 years ago
- Code for the paper "On Universal Black-Box Domain Adaptation" ☆10 · Sep 2, 2021 · Updated 4 years ago
- A PyTorch implementation of scalable neural networks ☆23 · Jun 9, 2020 · Updated 5 years ago
- [ICLR 2023] "Revisiting Pruning At Initialization Through The Lens of Ramanujan Graph" by Duc Hoang, Shiwei Liu, Radu Marculescu, Atlas W… ☆14 · Aug 4, 2023 · Updated 2 years ago
- DataLoader for the TinyImageNet dataset ☆12 · Sep 15, 2021 · Updated 4 years ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Jun 9, 2021 · Updated 4 years ago
- ☆23 · Aug 14, 2022 · Updated 3 years ago