This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation).
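For orientation, a minimal sketch of the classic distillation objective (Hinton et al., 2015) that most of the repositories below build on: the student matches the teacher's temperature-softened logits via KL divergence, blended with the usual cross-entropy on ground-truth labels. The tensor shapes and the blend weight `alpha` here are illustrative, not taken from any particular listed repo.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    # Soften both distributions with the temperature, then compare
    # them with KL divergence (input must be log-probabilities).
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean")
    kd = kd * (temperature ** 2)  # rescale gradients as in the paper
    # Standard supervised loss against the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Example: a batch of 8 samples over 10 classes.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
```

Many of the papers listed below replace or extend the KL term (feature-level, multi-teacher, logit-level variants) while keeping this overall teacher-student structure.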
☆82 · Mar 19, 2025 · Updated 11 months ago
Alternatives and similar repositories for Knowledge-Distillation-Paper
Users interested in Knowledge-Distillation-Paper are comparing it to the repositories listed below.
- A list of papers, docs, and code about diffusion distillation. This repo collects various distillation methods for the Diffusion model. Welc… ☆40 · Dec 10, 2023 · Updated 2 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆78 · Jul 29, 2024 · Updated last year
- Reading Papers ☆14 · Mar 26, 2021 · Updated 4 years ago
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆102 · Jun 16, 2022 · Updated 3 years ago
- This is the implementation for the ICME-2023 paper (Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning). ☆34 · Apr 1, 2023 · Updated 2 years ago
- Multi-Teacher Knowledge Distillation, code for my PhD dissertation. I used knowledge distillation as a decision-fusion and compressing m… ☆27 · May 19, 2023 · Updated 2 years ago
- Awesome Knowledge-Distillation. A categorized collection of knowledge-distillation papers (2014-2021). ☆2,654 · May 30, 2023 · Updated 2 years ago
- Awesome Knowledge-Distillation for CV ☆92 · Apr 30, 2024 · Updated last year
- This is the implementation for the ICASSP-2022 paper (Confidence-Aware Multi-Teacher Knowledge Distillation). ☆63 · Feb 12, 2022 · Updated 4 years ago
- ACCV2022 Source Code of paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Jul 5, 2023 · Updated 2 years ago
- Code for 'Multi-level Logit Distillation' (CVPR2023) ☆71 · Sep 23, 2024 · Updated last year
- Gradients as Features for Deep Representation Learning ☆43 · Mar 8, 2020 · Updated 5 years ago
- Forecasting, anomaly detection, predictive analytics, econometrics and much more ☆16 · Jun 29, 2020 · Updated 5 years ago
- [IJCV 2022] Domain-Specific Bias Filtering for Single Labeled Domain Generalization ☆12 · Nov 10, 2022 · Updated 3 years ago
- Source code to the paper "A Closest Point Proposal for MCMC-based Probabilistic Surface Registration" ☆30 · Sep 24, 2021 · Updated 4 years ago
- Source Code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Dec 21, 2022 · Updated 3 years ago
- [NeurIPS'22] What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective ☆37 · Dec 15, 2022 · Updated 3 years ago
- ☆14 · Apr 11, 2024 · Updated last year
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆76 · Jul 6, 2023 · Updated 2 years ago
- Official Implementation of "Style Generator Inversion for Image Enhancement and Animation". ☆13 · Dec 2, 2021 · Updated 4 years ago
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Mar 24, 2023 · Updated 2 years ago
- A Fine-Grained House Music Dataset ☆23 · Oct 28, 2022 · Updated 3 years ago
- Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression ☆19 · Oct 12, 2021 · Updated 4 years ago
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆16 · Oct 22, 2020 · Updated 5 years ago
- Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML2023) ☆40 · Aug 28, 2023 · Updated 2 years ago
- knowledge distillation papers ☆767 · Feb 10, 2023 · Updated 3 years ago
- ☆19 · Jun 26, 2021 · Updated 4 years ago
- A benchmark suite for Scalable Diverse Model Selection for Accessible Transfer Learning from our NeurIPS 2021 paper. ☆15 · Dec 14, 2022 · Updated 3 years ago
- PyTorch implementation for Channel Distillation ☆103 · Jun 9, 2020 · Updated 5 years ago
- Code for the Paper 'On the Connection Between Adversarial Robustness and Saliency Map Interpretability' by C. Etmann, S. Lunz, P. Maass, … ☆16 · May 9, 2019 · Updated 6 years ago
- Implementation for Variational Information Bottleneck for Effective Low-resource Fine-tuning, ICLR 2021 ☆43 · May 10, 2021 · Updated 4 years ago
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆119 · Feb 9, 2021 · Updated 5 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆106 · Sep 9, 2019 · Updated 6 years ago
- Pytorch implementation of various Knowledge Distillation (KD) methods. ☆1,745 · Nov 25, 2021 · Updated 4 years ago
- [NeurIPS 2019] E2-Train: Training State-of-the-art CNNs with Over 80% Less Energy ☆21 · Nov 18, 2019 · Updated 6 years ago
- Awesome Knowledge Distillation ☆3,817 · Dec 25, 2025 · Updated 2 months ago
- A study of distance measures and learning methods for semi-supervised learning on time series data ☆17 · Jun 23, 2021 · Updated 4 years ago
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation ☆46 · Jun 13, 2023 · Updated 2 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,425 · Oct 16, 2023 · Updated 2 years ago