SforAiDl / KD_Lib
A PyTorch knowledge distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
★ 618 · Updated 2 years ago
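For context, the core technique these libraries build on is vanilla (Hinton-style) knowledge distillation: a student network is trained against a blend of the ground-truth labels and the teacher's temperature-softened outputs. The snippet below is a minimal plain-PyTorch sketch of that loss, not KD_Lib's own API; `teacher`, `student`, `loader`, `T`, and `alpha` are illustrative placeholders.

```python
# Minimal sketch of vanilla knowledge distillation in plain PyTorch.
# Not KD_Lib's API; `teacher`, `student`, and `loader` are placeholders.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-softened distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_student(student, teacher, loader, optimizer, device="cpu"):
    teacher.eval()
    student.train()
    for inputs, labels in loader:
        inputs, labels = inputs.to(device), labels.to(device)
        with torch.no_grad():
            teacher_logits = teacher(inputs)  # teacher is frozen
        loss = distillation_loss(student(inputs), teacher_logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```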
Alternatives and similar repositories for KD_Lib:
Users interested in KD_Lib are comparing it to the libraries listed below.
- A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 25 knowledge distillation methods p… ★ 1,455 · Updated this week
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ★ 584 · Updated 2 years ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. ★ 1,662 · Updated 3 years ago
- A large-scale study of Knowledge Distillation. ★ 219 · Updated 4 years ago
- A general and accurate MACs / FLOPs profiler for PyTorch models ★ 599 · Updated 9 months ago
- Toolbox to extend PyTorch functionalities ★ 419 · Updated 9 months ago
- Estimate/count FLOPS for a given neural network using PyTorch ★ 304 · Updated 2 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ★ 1,913 · Updated last year
- NFNets and Adaptive Gradient Clipping for SGD implemented in PyTorch. Find explanation at tourdeml.github.io/blog/ ★ 344 · Updated last year
- PyTorch library to facilitate development and standardized evaluation of neural network pruning methods. ★ 428 · Updated last year
- Knowledge distillation papers ★ 749 · Updated 2 years ago
- Unofficial PyTorch reimplementation of RandAugment. ★ 631 · Updated last year
- PyTorch layer-by-layer model profiler ★ 606 · Updated 3 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ★ 416 · Updated 4 years ago
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop. ★ 697 · Updated 3 years ago
- PyTorch implementation of the paper "Class-Balanced Loss Based on Effective Number of Samples" ★ 791 · Updated last year
- AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty ★ 983 · Updated 7 months ago
- Awesome machine learning model compression research papers, quantization, tools, and learning material. ★ 504 · Updated 5 months ago
- Open-source code for the paper "Dataset Distillation" ★ 783 · Updated 2 years ago
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019 ★ 397 · Updated 3 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ★ 2,272 · Updated last year
- Escaping the Big Data Paradigm with Compact Transformers, 2021 (Train your Vision Transformers in 30 mins on CIFAR-10 with a single GPU!) ★ 515 · Updated 3 months ago
- [ICLR 2020] Once for All: Train One Network and Specialize it for Efficient Deployment ★ 1,899 · Updated last year
- On-the-fly structured pruning for PyTorch models. This library implements several attribution metrics and structured pruning utils for n… ★ 164 · Updated 4 years ago
- A list of multi-task learning papers and projects. ★ 369 · Updated 3 years ago
- ★ 584 · Updated 4 months ago
- Gradually-Warmup Learning Rate Scheduler for PyTorch ★ 987 · Updated 4 months ago
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014–2021), organized by category. ★ 2,549 · Updated last year
- Distilling Knowledge via Knowledge Review, CVPR 2021 ★ 266 · Updated 2 years ago
- My best practice for training on large datasets using PyTorch. ★ 1,092 · Updated 9 months ago