yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. Trained models, training logs, and configurations are available for ensuring reproducibility and benchmarking.
⭐ 1,471 · Updated this week
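torchdistill and most of the libraries listed below build on the soft-target distillation loss of Hinton et al. (2015). The snippet below is a minimal, framework-agnostic sketch of that loss in plain PyTorch; it is not torchdistill's configuration-driven API, and the temperature and weighting values are illustrative assumptions.

```python
# Minimal sketch of the classic soft-target knowledge distillation loss
# (Hinton et al., 2015). NOT torchdistill's API; temperature and alpha
# are illustrative assumptions.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with a temperature-scaled KL term."""
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # KL divergence between softened distributions, scaled by T^2 as in the paper.
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    hard = F.cross_entropy(student_logits, targets)
    return alpha * distill + (1 - alpha) * hard

# Toy usage: random tensors stand in for teacher/student outputs on a 10-class task.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
print(kd_loss(student_logits, teacher_logits, targets))
```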
Alternatives and similar repositories for torchdistill:
Users interested in torchdistill are comparing it to the libraries listed below.
- PyTorch implementation of various Knowledge Distillation (KD) methods. ⭐ 1,674 · Updated 3 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD) and a benchmark of recent knowledge distillation methods. ⭐ 2,295 · Updated last year
- A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization. ⭐ 619 · Updated 2 years ago
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category. ⭐ 2,563 · Updated last year
- The official implementation of [CVPR 2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV 2023] DOT: A Distillation-Oriented Trainer. ⭐ 840 · Updated last year
- Collection of common code shared among different research projects in the FAIR computer vision team. ⭐ 2,094 · Updated 3 months ago
- OpenMMLab Model Compression Toolbox and Benchmark. ⭐ 1,564 · Updated 9 months ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility. ⭐ 1,921 · Updated 2 years ago
- Official DeiT repository. ⭐ 4,163 · Updated last year
- PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral). ⭐ 1,322 · Updated 9 months ago
- [ICLR 2020] Once for All: Train One Network and Specialize it for Efficient Deployment. ⭐ 1,903 · Updated last year
- A curated list of neural network pruning resources.