Kennethborup / knowledgeDistillation
PyTorch implementation of (Hinton) Knowledge Distillation, plus a base class that simplifies implementing other distillation methods.
☆27 · Updated 3 years ago
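For context, the Hinton-style distillation loss that this repo implements combines a temperature-softened KL divergence against the teacher's outputs with the usual cross-entropy on hard labels. Below is a minimal sketch; the function name and the `temperature`/`alpha` parameters are illustrative, not this repo's actual API.

```python
import torch
import torch.nn.functional as F

def hinton_kd_loss(student_logits, teacher_logits, targets,
                   temperature=4.0, alpha=0.9):
    """Hinton et al. (2015) distillation loss (illustrative sketch).

    KL term between temperature-softened student and teacher
    distributions, plus cross-entropy on the hard labels. The T**2
    factor keeps the soft-target gradients on the same scale as the
    hard-label gradients when the temperature changes.
    """
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, targets)
    return alpha * kd + (1 - alpha) * ce

# Toy usage with random logits for a batch of 8 over 10 classes.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = hinton_kd_loss(student, teacher, labels)
```

Note that `F.kl_div` expects log-probabilities as its first argument and probabilities as its second, which is why the student side uses `log_softmax` and the teacher side uses `softmax`.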
Related projects
Alternatives and complementary repositories for knowledgeDistillation
- PyTorch / PyTorch Lightning framework for experimenting with knowledge distillation on image classification problems ☆31 · Updated 3 months ago
- IJCAI 2021, "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" ☆39 · Updated last year
- A curated collection of important papers on knowledge distillation (awesome-knowledge-distillation) ☆72 · Updated 2 months ago
- ZSKD with PyTorch ☆30 · Updated last year
- Code for the DDP tutorial ☆31 · Updated 2 years ago
- Code for Active Mixup (CVPR 2020) ☆22 · Updated 2 years ago
- A regularized self-labeling approach to improve the generalization and robustness of fine-tuned models ☆27 · Updated 2 years ago
- Stochastic Weight Averaging tutorials using PyTorch ☆33 · Updated 4 years ago
- PyTorch implementation of the IEEE TNNLS 2022 paper "Distilling a Powerful Student Model via Online Knowledge Distillation" ☆28 · Updated 2 years ago
- Code for the PAPA paper ☆27 · Updated 2 years ago
- AMTML-KD: Adaptive Multi-teacher Multi-level Knowledge Distillation ☆48 · Updated 3 years ago
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated last year
- Implementation of the AAAI 2022 paper "Go Wider Instead of Deeper" ☆32 · Updated 2 years ago
- Adversarial examples against the new ConvNeXt architecture ☆20 · Updated 2 years ago
- ☆55 · Updated 3 years ago
- Repo for the paper "Normalization Techniques in Training DNNs: Methodology, Analysis and Application" ☆84 · Updated 3 years ago
- A generic code base for neural network pruning, especially pruning at initialization ☆30 · Updated 2 years ago
- Benchmark for federated noisy-label learning ☆19 · Updated 2 months ago
- ScrollNet for Continual Learning ☆11 · Updated last year
- Implementation of TableFormer (Robust Transformer Modeling for Table-Text Encoding) in PyTorch ☆35 · Updated 2 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆41 · Updated 2 years ago
- ☆19 · Updated 3 years ago
- Demonstration of knowledge transfer and generalization with distillation ☆45 · Updated 5 years ago
- ☆26 · Updated last year
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" ☆90 · Updated 2 years ago
- Several types of attention modules written in PyTorch ☆39 · Updated last month
- Structured Pruning Adapters in PyTorch ☆15 · Updated last year
- GitHub repo for the conference paper GLOD: Gaussian Likelihood OOD Detector ☆16 · Updated 2 years ago
- Code for "Multi-level Logit Distillation" (CVPR 2023) ☆54 · Updated last month
- Official repo for "Firefly Neural Architecture Descent: A General Approach for Growing Neural Networks", accepted at NeurIPS 2020 ☆29 · Updated 4 years ago