SforAiDl / KD_Lib
A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
⭐646 · Updated 2 years ago
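For context, classic knowledge distillation of the kind these libraries implement trains a small student network to match a large teacher's temperature-softened output distribution alongside the ground-truth labels. Below is a minimal sketch of that Hinton-style loss in plain PyTorch; the function name and the hyperparameter defaults are illustrative assumptions, not KD_Lib's actual API.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    # Illustrative sketch; names and defaults are assumptions, not KD_Lib's API.
    # Soft targets: KL divergence between temperature-softened distributions.
    # Scaling by T^2 keeps gradient magnitudes comparable across temperatures
    # (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```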
Alternatives and similar repositories for KD_Lib
Users interested in KD_Lib are comparing it to the libraries listed below.
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 26 knowledge distillation methods p… ⭐1,558 · Updated last month
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ⭐584 · Updated 2 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. ⭐1,717 · Updated 3 years ago
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ⭐431 · Updated 2 years ago
- NFNets and Adaptive Gradient Clipping for SGD implemented in PyTorch. Find explanation at tourdeml.github.io/blog/ ⭐349 · Updated last year
- A large-scale study of Knowledge Distillation. ⭐220 · Updated 5 years ago
- Toolbox to extend PyTorch functionalities ⭐419 · Updated last year
- ⭐605 · Updated last month
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ⭐1,966 · Updated 2 years ago
- Escaping the Big Data Paradigm with Compact Transformers, 2021 (Train your Vision Transformers in 30 mins on CIFAR-10 with a single GPU!) ⭐536 · Updated 11 months ago
- Open-source code for paper "Dataset Distillation"