karanchahal / distiller
A large-scale study of Knowledge Distillation.
☆220 · Updated Apr 19, 2020 (5 years ago)
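For readers new to the topic covered by this repository and the alternatives below: in its standard form, knowledge distillation trains a compact student network to match a larger teacher's temperature-softened outputs alongside the usual hard labels. The sketch below is a minimal PyTorch version of that loss, assuming the common formulation from Hinton et al. (2015); the tensor names, temperature `T`, and mixing weight `alpha` are illustrative assumptions, not code taken from this repository.

```python
# Minimal sketch of the standard soft-target distillation loss (assumed
# Hinton et al., 2015 formulation); names and defaults are illustrative.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend a softened KL term against the teacher with ordinary cross-entropy."""
    # Soft targets: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Most of the libraries listed below extend or replace this soft-target term (feature-level, relational, contrastive, and zero-shot variants).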
Alternatives and similar repositories for distiller
Users interested in distiller are comparing it to the libraries listed below.
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,742 · Updated Nov 25, 2021 (4 years ago)
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 ☆414 · Updated May 17, 2021 (4 years ago)
- Awesome Knowledge-Distillation: knowledge distillation papers (2014–2021), organized by category. ☆2,654 · Updated May 30, 2023 (2 years ago)
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆264 · Updated Oct 3, 2019 (6 years ago)
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆423 · Updated Jun 23, 2020 (5 years ago)
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,426 · Updated Oct 16, 2023 (2 years ago)
- Awesome Knowledge Distillation ☆3,811 · Updated Dec 25, 2025 (last month)
- Dynamic Distribution Pruning for Efficient Network Architecture Search ☆47 · Updated Jun 24, 2019 (6 years ago)
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added). ☆265 · Updated Nov 21, 2019 (6 years ago)
- Zero-Shot Knowledge Distillation in Deep Networks in ICML2019 ☆49 · Updated Jun 20, 2019 (6 years ago)
- A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quan… ☆653 · Updated Mar 1, 2023 (2 years ago)
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆1,978 · Updated Mar 25, 2023 (2 years ago)
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆25 · Updated May 25, 2020 (5 years ago)
- Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆585 · Updated Feb 15, 2023 (3 years ago)
- Single shot neural network pruning before training the model, based on connection sensitivity ☆11 · Updated Aug 7, 2019 (6 years ago)
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated Nov 22, 2022 (3 years ago)
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆26 knowledge distillation methods p… ☆1,590 · Updated Dec 24, 2025 (last month)
- ☆61 · Updated Apr 24, 2020 (5 years ago)
- (IJCAI 2019) Knowledge Amalgamation from Heterogeneous Networks by Common Feature Learning ☆10 · Updated Nov 25, 2022 (3 years ago)
- Zero-Shot Knowledge Distillation in Deep Networks ☆67 · Updated Apr 16, 2022 (3 years ago)
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆106 · Updated Sep 9, 2019 (6 years ago)
- All about acceleration and compression of Deep Neural Networks ☆33 · Updated Nov 5, 2019 (6 years ago)
- Codes for DATA: Differentiable ArchiTecture Approximation. ☆11 · Updated Jul 22, 2021 (4 years ago)
- Searching a High Performance Feature Extractor for Text Recognition Network. TPAMI 2022 ☆13 · Updated Nov 25, 2022 (3 years ago)
- Universal Python binding for the LMDB 'Lightning' Database ☆13 · Updated Nov 7, 2017 (8 years ago)
- MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning. In ICCV 2019. ☆352 · Updated Jul 5, 2020 (5 years ago)
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated Jun 18, 2020 (5 years ago)
- My best practices for training on large datasets with PyTorch. ☆1,104 · Updated May 9, 2024 (last year)
- Implements quantized distillation. Code for our paper "Model compression via distillation and quantization" ☆336 · Updated Jul 25, 2024 (last year)
- Code for the paper "Cooperative Pruning in Cross-Domain Deep Neural Network Compression", accepted at IJCAI 2019. ☆12 · Updated Aug 15, 2019 (6 years ago)
- Graph Knowledge Distillation ☆13 · Updated Mar 6, 2020 (5 years ago)
- Implementation of semi-supervised learning using PyTorch Lightning ☆14 · Updated Jul 25, 2024 (last year)
- Semi-supervised Adaptive Distillation is a model compression method for object detection. ☆59 · Updated Oct 9, 2019 (6 years ago)
- Implementation of CVPR 2019 paper: Distilling Object Detectors with Fine-grained Feature Imitation ☆420 · Updated Jul 15, 2021 (4 years ago)
- PyTorch implementation of Towards Efficient Training for Neural Network Quantization ☆16 · Updated Jan 16, 2020 (6 years ago)
- Gold Loss Correction for training neural networks with labels corrupted with severe noise ☆13 · Updated Aug 17, 2019 (6 years ago)
- Notes and tutorials on "Mutual information and self-supervised learning" ☆39 · Updated Nov 1, 2019 (6 years ago)
- Code for Active Mixup in CVPR 2020 ☆23 · Updated Jan 11, 2022 (4 years ago)
- ☆23 · Updated Oct 27, 2019 (6 years ago)