Kennethborup / knowledgeDistillation
PyTorch implementation of (Hinton) Knowledge Distillation, plus a base class that simplifies implementing other distillation methods.
☆28 · Updated 3 years ago
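For context, Hinton-style distillation trains the student on a temperature-scaled KL term against the teacher's soft targets plus a standard cross-entropy term on the hard labels. Below is a minimal sketch of that loss in PyTorch; the function name and defaults are illustrative, not this repository's actual API.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Hinton-style KD loss: soft KL term (temperature T) + hard CE term."""
    # KL divergence between temperature-softened student and teacher distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps soft-target gradients comparable across T
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```

The `T * T` factor follows Hinton et al. (2015): since softening divides logits by `T`, gradients from the soft term scale as `1/T^2`, and multiplying back keeps the two terms balanced as `T` varies.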
Related projects
Alternatives and complementary repositories for knowledgeDistillation
- PyTorch / PyTorch Lightning framework for trying knowledge distillation in image classification problems ☆32 · Updated 3 months ago
- Demonstration of transfer of knowledge and generalization with distillation ☆45 · Updated 5 years ago
- A curated collection of important papers on knowledge distillation (awesome-knowledge-distillation) ☆72 · Updated 2 months ago
- ZSKD with PyTorch ☆30 · Updated last year
- IJCAI 2021, "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" ☆39 · Updated last year
- A regularized self-labeling approach to improve the generalization and robustness of fine-tuned models ☆27 · Updated 2 years ago
- Code for Active Mixup (CVPR 2020) ☆22 · Updated 2 years ago
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated last year
- sharpDARTS: Faster and More Accurate Differentiable Architecture Search ☆16 · Updated 3 years ago
- [ACL 2023] Code for the paper “Tailoring Instructions to Student’s Learning Levels Boosts Knowledge Distillation” (https://arxiv.org/abs/2305.… ☆37 · Updated last year
- AAAI 2021: Robustness of Accuracy Metric and its Inspirations in Learning with Noisy Labels ☆23 · Updated 3 years ago
- PyTorch implementation of Soft MoE by Google Brain in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf) ☆66 · Updated last year
- Official PyTorch implementation of “Flexible Dataset Distillation: Learn Labels Instead of Images” ☆41 · Updated 4 years ago
- Label smoothing PyTorch implementation ☆30 · Updated 4 years ago
- A Python Package for Deep Imbalanced Learning ☆54 · Updated last year
- Recycling diverse models ☆44 · Updated last year
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" ☆67 · Updated 2 years ago
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) ☆115 · Updated 3 years ago
- PyTorch implementation of "Distilling a Powerful Student Model via Online Knowledge Distillation" (IEEE TNNLS, 2022) ☆28 · Updated 3 years ago
- Stochastic Weight Averaging tutorials using PyTorch ☆33 · Updated 4 years ago
- Implementation of Online Label Smoothing in PyTorch ☆94 · Updated 2 years ago
- Implementation of TableFormer, Robust Transformer Modeling for Table-Text Encoding, in PyTorch ☆36 · Updated 2 years ago
- A generic code base for neural network pruning, especially for pruning at initialization ☆30 · Updated 2 years ago
- AMTML-KD: Adaptive Multi-teacher Multi-level Knowledge Distillation ☆51 · Updated 3 years ago
- Code for the paper “What Data Benefits My Classifier?” Enhancing Model Performance and Interpretability through Influence-Based Data Selecti… ☆22 · Updated 6 months ago
- Structured Pruning Adapters in PyTorch ☆15 · Updated last year
- Benchmark for federated noisy label learning ☆19 · Updated 2 months ago