Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021)
☆119 · Updated Feb 9, 2021
Alternatives and similar repositories for attention-feature-distillation
Users who are interested in attention-feature-distillation are comparing it to the libraries listed below.
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) · ☆103 · Updated Apr 30, 2024
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (TNNLS), https://ieeexplore.ieee.org/abstract/… · ☆12 · Updated Dec 21, 2022
- ☆10 · Updated Feb 22, 2022
- ☆19 · Updated Jun 26, 2021
- ☆34 · Updated Aug 20, 2023
- Official implementation of "Cross-Layer Distillation with Semantic Calibration" (AAAI 2021, TKDE 2023) · ☆78 · Updated Jul 29, 2024
- Code for "Self-Distillation as Instance-Specific Label Smoothing" · ☆15 · Updated Oct 22, 2020
- Contrastive Representation Distillation (CRD) (ICLR 2020), plus a benchmark of recent knowledge distillation methods · ☆2,427 · Updated Oct 16, 2023
- Code for "Self-Distillation from the Last Mini-Batch for Consistency Regularization" · ☆44 · Updated Sep 27, 2022
- Code for "Paraphrasing Complex Network: Network Compression via Factor Transfer" (NeurIPS 2018) · ☆20 · Updated Jul 22, 2020
- "Distilling Knowledge via Knowledge Review" (CVPR 2021) · ☆277 · Updated Dec 16, 2022
- Official implementation of "Decoupled Knowledge Distillation" (CVPR 2022, https://arxiv.org/abs/2203.08679) and "DOT: A Distill…" (ICCV 2023) · ☆900 · Updated Nov 5, 2023
- "Regularizing Class-wise Predictions via Self-knowledge Distillation" (CVPR 2020) · ☆109 · Updated Jun 18, 2020
- ☆27 · Updated Jun 20, 2021
- PyTorch implementations of various Knowledge Distillation (KD) methods · ☆1,746 · Updated Nov 25, 2021
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021) · ☆2,664 · Updated May 30, 2023
- ☆31 · Updated Jun 18, 2020
- "Attention as Geometric Transformation: Revisiting Feature Distillation for Semantic Segmentation" (WACV'26) · ☆43 · Updated Apr 5, 2026
- PyTorch implementation of "Matching Guided Distillation" (ECCV'20) · ☆66 · Updated Aug 7, 2021
- ☆19 · Updated Mar 12, 2024
- A simple PyTorch reimplementation of "Online Knowledge Distillation via Collaborative Learning" · ☆50 · Updated Dec 13, 2022
- PyTorch implementation of "Adversarially Robust Distillation" (ARD) · ☆59 · Updated May 24, 2019
- "CIFS: Improving Adversarial Robustness of CNNs via Channel-wise Importance-based Feature Selection" · ☆20 · Updated Oct 12, 2021
- Self-distillation with weighted ground-truth targets; ResNet and kernel ridge regression · ☆19 · Updated Oct 12, 2021
- Official implementation of "MEAL: Multi-Model Ensemble via Adversarial Learning" (AAAI 2019) · ☆177 · Updated Feb 20, 2020
- GitHub repository for the conference paper "GLOD: Gaussian Likelihood OOD Detector" · ☆16 · Updated Apr 18, 2022
- Code for "Feature Fusion for Online Mutual Knowledge Distillation" · ☆27 · Updated Jul 21, 2020
- "Distilling Knowledge via Intermediate Classifiers" · ☆16 · Updated Oct 3, 2021
- Official PyTorch implementation of "NegMerge: Sign-Consensual Weight Merging for Machine Unlearning" (ICML 2025) · ☆14 · Updated Nov 25, 2025
- "Knowledge Amalgamation from Heterogeneous Networks by Common Feature Learning" (IJCAI 2019) · ☆10 · Updated Nov 25, 2022
- Official code for "A Fast Knowledge Distillation Framework for Visual Recognition" (ECCV'22) · ☆191 · Updated Apr 29, 2024
- ☆13 · Updated Aug 28, 2018
- Code for "AGKD-BML: Defense Against Adversarial Attack by Attention Guided Knowledge Distillation and Bi-directional Met…" (ICCV 2021) · ☆12 · Updated Mar 3, 2022
- "Revisiting Knowledge Distillation via Label Smoothing Regularization" (CVPR 2020 oral) · ☆583 · Updated Feb 15, 2023
- Official implementation of "Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and E…" (ICLR 2021) · ☆63 · Updated Jun 16, 2021
- "Complementary Relation Contrastive Distillation" · ☆17 · Updated Jun 29, 2021
- "Information Theoretic Representation Distillation" (BMVC 2022) · ☆19 · Updated Oct 6, 2023
- Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" (AAAI 2023) · ☆180 · Updated Dec 3, 2024
- A PyTorch knowledge distillation library for benchmarking and extending work in Knowledge Distillation, Pruning, and Quan… · ☆651 · Updated Mar 1, 2023