Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021)
☆119 · Feb 9, 2021 · Updated 5 years ago
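For context, most of the repositories listed below build on the classic soft-target knowledge distillation objective (Hinton et al.), where a student network is trained to match the temperature-softened output distribution of a teacher. This is a minimal pure-Python sketch of that generic loss, not the attention-based feature-matching loss of this particular repository; the function names and temperature value are illustrative choices.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits (numerically stable)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Hinton-style soft-target loss: T^2 * KL(teacher || student).

    The T^2 factor keeps gradient magnitudes comparable across temperatures.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs (near-)zero loss;
# a mismatched student incurs a strictly positive loss.
print(kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(kd_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0]))
```

Feature-distillation methods such as the one in this repository replace or supplement this logit-level term with losses on intermediate feature maps, but the logit-level objective above remains the common baseline.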
Alternatives and similar repositories for attention-feature-distillation
Users who are interested in attention-feature-distillation are comparing it to the repositories listed below.
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR-2021) ☆103 · Apr 30, 2024 · Updated last year
- Source Code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Dec 21, 2022 · Updated 3 years ago
- ☆34 · Aug 20, 2023 · Updated 2 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆78 · Jul 29, 2024 · Updated last year
- PyTorch implementation for Channel Distillation ☆103 · Jun 9, 2020 · Updated 5 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,428 · Oct 16, 2023 · Updated 2 years ago
- Code for Paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆44 · Sep 27, 2022 · Updated 3 years ago
- Paraphrasing Complex Network: Network Compression via Factor Transfer Code (NeurIPS 2018) ☆20 · Jul 22, 2020 · Updated 5 years ago
- ☆27 · Jun 28, 2022 · Updated 3 years ago
- ☆114 · Apr 21, 2021 · Updated 4 years ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆278 · Dec 16, 2022 · Updated 3 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… ☆895 · Nov 5, 2023 · Updated 2 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆423 · Jun 23, 2020 · Updated 5 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Jun 18, 2020 · Updated 5 years ago
- Pytorch implementation of various Knowledge Distillation (KD) methods. ☆1,745 · Nov 25, 2021 · Updated 4 years ago
- ☆31 · Jun 18, 2020 · Updated 5 years ago
- ☆19 · Mar 12, 2024 · Updated 2 years ago
- A simple reimplement Online Knowledge Distillation via Collaborative Learning with pytorch ☆50 · Dec 13, 2022 · Updated 3 years ago
- Pytorch implementation of Adversarially Robust Distillation (ARD) ☆59 · May 24, 2019 · Updated 6 years ago
- CIFS: Improving Adversarial Robustness of CNNs via Channel-wise Importance-based Feature Selection ☆20 · Oct 12, 2021 · Updated 4 years ago
- Github for the conference paper GLOD-Gaussian Likelihood OOD detector ☆16 · Apr 18, 2022 · Updated 3 years ago
- Feature Fusion for Online Mutual Knowledge Distillation Code ☆27 · Jul 21, 2020 · Updated 5 years ago
- Distilling Knowledge via Intermediate Classifiers ☆16 · Oct 3, 2021 · Updated 4 years ago
- [ICML 2025] Official PyTorch implementation of "NegMerge: Sign-Consensual Weight Merging for Machine Unlearning" ☆15 · Nov 25, 2025 · Updated 4 months ago
- (IJCAI 2019) Knowledge Amalgamation from Heterogeneous Networks by Common Feature Learning ☆10 · Nov 25, 2022 · Updated 3 years ago
- ☆23 · May 6, 2022 · Updated 3 years ago
- reproduction of the CVPR'21 paper Distilling Knowledge via Knowledge Review for the ML Reproducibility Challenge 2021 ☆10 · Apr 16, 2022 · Updated 3 years ago
- Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition" ☆191 · Apr 29, 2024 · Updated last year
- Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆584 · Feb 15, 2023 · Updated 3 years ago
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods p… ☆1,604 · Dec 24, 2025 · Updated 3 months ago
- [BMVC 2022] Information Theoretic Representation Distillation ☆19 · Oct 6, 2023 · Updated 2 years ago
- This repository contains all code and data for the Inside Out Visual Place Recognition task ☆23 · Nov 24, 2021 · Updated 4 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020) ☆30 · Aug 19, 2020 · Updated 5 years ago
- Implementation of paper 'Reversing the Forget-Retain Objectives: An Efficient LLM Unlearning Framework from Logit Difference' [NeurIPS'24… ☆26 · Jun 14, 2024 · Updated last year
- ☆20 · Jan 16, 2026 · Updated 2 months ago
- ☆10 · Nov 18, 2024 · Updated last year
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && Pytorch Implementations of… ☆110 · Nov 28, 2022 · Updated 3 years ago
- Knowledge Amalgamation Engine ☆99 · Feb 28, 2024 · Updated 2 years ago
- [ICML2024] "FedLMT: Tackling System Heterogeneity of Federated Learning via Low-Rank Model Training with Theoretical Guarantees" by Jiaha… ☆14 · Sep 22, 2024 · Updated last year