dkozlov / awesome-knowledge-distillation
Awesome Knowledge Distillation
☆3,679 · Updated last week
Alternatives and similar repositories for awesome-knowledge-distillation
Users interested in awesome-knowledge-distillation compare it to the libraries listed below.
- Awesome Knowledge-Distillation: knowledge distillation papers (2014–2021), organized by category. ☆2,591 · Updated 2 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility. ☆1,939 · Updated 2 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,694 · Updated 3 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods. ☆2,336 · Updated last year
- Knowledge distillation papers. ☆755 · Updated 2 years ago
- A curated list of neural network pruning resources. ☆2,449 · Updated last year
- Count the MACs / FLOPs of your PyTorch model. ☆4,993 · Updated 10 months ago
- Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distille… ☆4,394 · Updated 2 years ago
- Improving Convolutional Networks via Attention Transfer (ICLR 2017). ☆1,455 · Updated 6 years ago
- Model analyzer in PyTorch. ☆1,481 · Updated 2 years ago
- PyTorch implementation of Contrastive Learning methods. ☆1,983 · Updated last year
- Automated Deep Learning: Neural Architecture Search Is Not the End (a curated list of AutoDL resources and an in-depth analysis). ☆2,288 · Updated 2 years ago
- Rethinking the Value of Network Pruning (PyTorch) (ICLR 2019). ☆1,514 · Updated 4 years ago
- PyTorch implementation of MoCo: https://arxiv.org/abs/1911.05722 ☆4,981 · Updated last month
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆 25 knowledge distillation methods p… ☆1,507 · Updated 2 weeks ago
- A curated list of awesome self-supervised methods. ☆6,280 · Updated 10 months ago
- FLOPs counter for neural networks in the PyTorch framework. ☆2,896 · Updated 4 months ago
- Collection of recent methods on (deep) neural network compression and acceleration. ☆949 · Updated last month
- Label smoothing, AM-Softmax, partial FC, focal loss, triplet loss, Lovász-softmax. Maybe useful. ☆2,241 · Updated 7 months ago
- Differentiable architecture search for convolutional and recurrent networks. ☆3,962 · Updated 4 years ago
- A quickstart and benchmark for PyTorch distributed training. ☆1,666 · Updated 10 months ago
- Awesome Incremental Learning. ☆4,075 · Updated last month
- A curated list of resources for Learning with Noisy Labels. ☆2,681 · Updated 3 weeks ago
- Codebase for Image Classification Research, written in PyTorch.
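For orientation, the technique most of the repositories above implement is soft-target knowledge distillation (Hinton et al.): the student is trained to match the teacher's temperature-softened output distribution. Below is a minimal, dependency-free sketch of that loss; the function names and the temperature value are illustrative and not taken from any listed repository.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T softens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Hinton-style distillation loss: T^2 * KL(teacher || student),
    with both distributions softened by the same temperature T.
    The T^2 factor keeps gradient magnitudes comparable across T."""
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# Identical logits give zero loss; mismatched logits give a positive loss.
print(round(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # 0.0
print(kd_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1]) > 0)        # True
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient; several of the PyTorch repositories listed above benchmark exactly such combinations.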