This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation).
☆85 · Updated Mar 19, 2025
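Most of the repositories listed below build on the same core idea: a compact student network is trained to match the softened output distribution of a larger teacher. The snippet below is a minimal, illustrative sketch of that classic soft-target loss (Hinton et al., 2015); the function name, temperature, and loss weighting are assumptions for illustration and are not taken from any particular repository in this list.

```python
# Minimal sketch of the soft-target knowledge distillation loss
# (Hinton et al., 2015). Illustrative only; hyperparameters are assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a softened teacher/student KL term with hard-label cross-entropy."""
    # Soften both distributions with the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # The KL term is scaled by T^2 so its gradient magnitude stays comparable
    # to the cross-entropy term.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```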
Alternatives and similar repositories for Knowledge-Distillation-Paper
Users interested in Knowledge-Distillation-Paper are comparing it to the libraries listed below.
- A list of papers, docs, and code about diffusion distillation. This repo collects various distillation methods for the diffusion model. Welc… ☆40 · Updated Dec 10, 2023
- Multi-Teacher Knowledge Distillation, code for my PhD dissertation. I used knowledge distillation as a decision-fusion and compressing m… ☆28 · Updated May 19, 2023
- Reading Papers ☆14 · Updated Mar 26, 2021
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆78 · Updated Jul 29, 2024
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category. ☆2,662 · Updated May 30, 2023
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" ☆180 · Updated Dec 3, 2024
- ☆17 · Updated Aug 1, 2025
- Code for 'Multi-level Logit Distillation' (CVPR 2023) ☆71 · Updated Sep 23, 2024
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier". ☆103 · Updated Jun 16, 2022
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Updated Dec 21, 2022
- [ACCV 2022] Source code for the paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Updated Jul 5, 2023
- Awesome Knowledge-Distillation for CV ☆94 · Updated Apr 30, 2024
- Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML 2023) ☆40 · Updated Aug 28, 2023
- A PyTorch implementation of the ICCV 2021 workshop paper "SimDis: Simple Distillation Baselines for Improving Small Self-supervised Models" ☆14 · Updated Jul 15, 2021
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated Mar 24, 2023
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆76 · Updated Jul 6, 2023
- [NeurIPS'22] What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective ☆37 · Updated Dec 15, 2022
- ☆14 · Updated Apr 11, 2024
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,749 · Updated Nov 25, 2021
- Awesome Knowledge Distillation ☆3,850 · Updated Mar 22, 2026
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,430 · Updated Oct 16, 2023
- PyTorch implementation for Channel Distillation ☆103 · Updated Jun 9, 2020
- A benchmark suite for Scalable Diverse Model Selection for Accessible Transfer Learning from our NeurIPS 2021 paper. ☆15 · Updated Dec 14, 2022
- The official implementation of [CVPR 2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV 2023] DOT: A Distill… ☆899 · Updated Nov 5, 2023
- Official implementation for "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR-2021) ☆103 · Updated Apr 30, 2024
- A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quan… ☆650 · Updated Mar 1, 2023
- Official repository for "Can Language Models be Instructed to Protect Personal Information?" ☆13 · Updated Oct 8, 2023
- CVPR 2023, Class Attention Transfer Based Knowledge Distillation ☆46 · Updated Jun 13, 2023
- Code for the CVPR'20 paper "Distilling Cross-Task Knowledge via Relationship Matching". ☆48 · Updated Feb 7, 2021
- Code for the paper "On the Connection Between Adversarial Robustness and Saliency Map Interpretability" by C. Etmann, S. Lunz, P. Maass, … ☆16 · Updated May 9, 2019
- [AAAI 2024] XKD: Cross-modal Knowledge Distillation with Domain Alignment for Video Representation Learning. ☆16 · Updated Jul 9, 2024
- Official implementation for "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI-2021) ☆119 · Updated Feb 9, 2021
- A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', … ☆180 · Updated Jan 29, 2022
- Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression ☆19 · Updated Oct 12, 2021
- Qimera: Data-free Quantization with Synthetic Boundary Supporting Samples [NeurIPS 2021] ☆34 · Updated Dec 12, 2021
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆15 · Updated Oct 22, 2020
- ☆19 · Updated Jun 26, 2021
- ☆13 · Updated Apr 9, 2021
- ☆11 · Updated Oct 24, 2024