haitongli / knowledge-distillation-pytorch
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
☆1,933 · Updated 2 years ago
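As background for this listing, here is a minimal sketch of the classic soft-target distillation objective (Hinton et al., 2015) that knowledge-distillation-pytorch explores; the temperature `T` and mixing weight `alpha` below are illustrative defaults, not this repo's settings:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both distributions with temperature T; the KL term is
    # scaled by T^2 to keep its gradient magnitude comparable to the
    # hard-label cross-entropy term. T and alpha are assumed values.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In training, the teacher is frozen and only the student's parameters receive gradients from this combined loss.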
Alternatives and similar repositories for knowledge-distillation-pytorch
Users interested in knowledge-distillation-pytorch are comparing it to the libraries listed below.
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,690 · Updated 3 years ago
- Awesome Knowledge-Distillation: knowledge distillation papers (2014–2021), organized by category. ☆2,584 · Updated last year
- Awesome Knowledge Distillation ☆3,668 · Updated 2 months ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods ☆2,327 · Updated last year
- Knowledge distillation papers ☆753 · Updated 2 years ago
- Rethinking the Value of Network Pruning (PyTorch) (ICLR 2019) ☆1,514 · Updated 4 years ago
- A curated list of neural network pruning resources. ☆2,440 · Updated last year
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆25 knowledge distillation methods p… ☆1,499 · Updated this week
- Classification with PyTorch. ☆1,726 · Updated 10 months ago
- mixup: Beyond Empirical Risk Minimization (see the mixup sketch after this list) ☆1,178 · Updated 3 years ago
- label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful ☆2,241 · Updated 6 months ago
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆586 · Updated 2 years ago
- Collection of recent methods on (deep) neural network compression and acceleration. ☆947 · Updated last month
- A quickstart and benchmark for PyTorch distributed training. ☆1,661 · Updated 9 months ago
- FLOPs counter for neural networks in the PyTorch framework ☆2,891 · Updated 3 months ago
- Model analyzer in PyTorch ☆1,482 · Updated 2 years ago
- Improving Convolutional Networks via Attention Transfer (ICLR 2017) ☆1,455 · Updated 6 years ago
- My best practices for training on large datasets with PyTorch. ☆1,098 · Updated last year
- PyTorch implementation of Contrastive Learning methods ☆1,979 · Updated last year
- Unofficial implementation of the ImageNet, CIFAR-10, and SVHN augmentation policies learned by AutoAugment, using Pillow ☆1,479 · Updated 2 years ago
- Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distille… ☆4,390 · Updated 2 years ago
- Count the MACs / FLOPs of your PyTorch model (see the FLOP-counting sketch after this list). ☆4,985 · Updated 10 months ago
- PyTorch DataLoaders implemented with DALI for accelerating image preprocessing ☆881 · Updated 4 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… ☆853 · Updated last year
- Official Implementation of 'Fast AutoAugment' in PyTorch. ☆1,600 · Updated 3 years ago
- A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization ☆625 · Updated 2 years ago
- Gradually-Warmup Learning Rate Scheduler for PyTorch (see the warmup sketch after this list) ☆989 · Updated 7 months ago
- Network Slimming (PyTorch) (ICCV 2017) ☆914 · Updated 4 years ago
- (no description) ☆668 · Updated 3 years ago
- Proper implementation of ResNet-s for CIFAR-10/100 in PyTorch that matches the description of the original paper. ☆1,275 · Updated 10 months ago
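For the mixup entry above, a minimal sketch of the technique: convexly combine pairs of training examples and their labels (Zhang et al., 2018). The `alpha=1.0` default is an illustrative choice, not that repo's recommendation:

```python
import torch

def mixup_batch(x, y, alpha=1.0):
    # Sample a mixing coefficient from Beta(alpha, alpha), then blend
    # each example with a randomly paired one from the same batch.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    index = torch.randperm(x.size(0))
    mixed_x = lam * x + (1.0 - lam) * x[index]
    return mixed_x, y, y[index], lam

# The loss is blended the same way:
# loss = lam * criterion(model(mixed_x), y_a) + (1 - lam) * criterion(model(mixed_x), y_b)
```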
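For the MAC/FLOP-counter entry, assuming it refers to thop (pytorch-OpCounter), typical usage looks like the following; `resnet18` and the input shape are just placeholders:

```python
import torch
from torchvision.models import resnet18
from thop import profile  # pip install thop

model = resnet18()
dummy = torch.randn(1, 3, 224, 224)           # placeholder input shape
macs, params = profile(model, inputs=(dummy,))
print(f"{macs / 1e9:.2f} GMACs, {params / 1e6:.2f} M params")
```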
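For the gradual-warmup scheduler entry, a similar effect can be sketched with PyTorch's built-in LambdaLR; the linear ramp and the `warmup_steps` value below are assumptions about the simplest warmup shape, not that repo's exact schedule:

```python
import torch
from torch.optim.lr_scheduler import LambdaLR

model = torch.nn.Linear(10, 2)                # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

warmup_steps = 500                            # assumed warmup length
scheduler = LambdaLR(
    optimizer,
    # Linearly scale the LR from ~0 up to the base LR, then hold it.
    lr_lambda=lambda step: min(1.0, (step + 1) / warmup_steps),
)

# Call scheduler.step() once per iteration, after optimizer.step().
```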