yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. Trained models, training logs, and configurations are available to ensure reproducibility and enable benchmarking.
⭐ 1,558 · Updated last month
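For context on what the libraries below implement, here is a minimal sketch of the classic soft-target knowledge distillation loss (Hinton et al., 2015) in plain PyTorch. This is illustrative only, not torchdistill's own API (torchdistill drives training through configuration files); the function name and default hyperparameters are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    # Soften both distributions with a temperature, then penalize the
    # KL divergence of the student from the teacher (Hinton et al., 2015).
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    kd = F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Blend the distillation term with ordinary cross-entropy on the labels.
    ce = F.cross_entropy(student_logits, targets)
    return alpha * kd + (1.0 - alpha) * ce
```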
Alternatives and similar repositories for torchdistill
Users interested in torchdistill are comparing it to the libraries listed below.
- PyTorch implementation of various knowledge distillation (KD) methods. ⭐ 1,717 · Updated 3 years ago
- A PyTorch knowledge distillation library for benchmarking and extending works in the domains of knowledge distillation, pruning, and quantization. ⭐ 646 · Updated 2 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility. ⭐ 1,966 · Updated 2 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods. ⭐ 2,381 · Updated last year
- SAM: Sharpness-Aware Minimization (PyTorch). ⭐ 1,921 · Updated last year
- This is a collection of our NAS and Vision Transformer work. ⭐ 1,806 · Updated last year
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer. ⭐ 872 · Updated last year
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category. ⭐ 2,627 · Updated 2 years ago
- A curated list of neural network pruning resources.