yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
⭐ 1,499 · Updated this week
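For context on what such frameworks implement: the classic soft-target distillation objective (Hinton et al.) combines a temperature-softened KL term against the teacher with the usual hard-label cross-entropy. The sketch below is a generic, minimal PyTorch illustration of that loss only; the function name and hyperparameter values are illustrative and it does not reflect torchdistill's actual configuration-driven API.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    """Hinton-style knowledge distillation loss (illustrative sketch):
    weighted sum of soft-target KL divergence and hard-label cross-entropy."""
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # T^2 rescaling keeps the soft-target gradients comparable to the CE term.
    kl = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, targets)
    return alpha * kl + (1 - alpha) * ce

# Toy usage: a batch of 8 samples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, targets)
loss.backward()
```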
Alternatives and similar repositories for torchdistill:
Users interested in torchdistill are comparing it to the libraries listed below.
- PyTorch implementation of various Knowledge Distillation (KD) methods. ⭐ 1,686 · Updated 3 years ago
- A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization. ⭐ 625 · Updated 2 years ago
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category. ⭐ 2,580 · Updated last year
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation (https://arxiv.org/abs/2203.08679) and [ICCV2023] DOT: A Distillation-Oriented Trainer. ⭐ 853 · Updated last year
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods. ⭐ 2,325 · Updated last year
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility. ⭐ 1,932 · Updated 2 years ago
- A collection of NAS and Vision Transformer work.