lhyfst / knowledge-distillation-papers
knowledge distillation papers
☆762 · Updated 2 years ago
Alternatives and similar repositories for knowledge-distillation-papers
Users interested in knowledge-distillation-papers are comparing it to the libraries listed below.
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,721 · Updated 3 years ago
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014–2021), organized by category. ☆2,631 · Updated 2 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility. ☆1,968 · Updated 2 years ago
- Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization. ☆584 · Updated 2 years ago
- ☆669 · Updated 4 years ago
- Collection of recent methods on (deep) neural network compression and acceleration. ☆952 · Updated 6 months ago
- Awesome Knowledge Distillation. ☆3,751 · Updated 4 months ago
- Rethinking the Value of Network Pruning (PyTorch) (ICLR 2019). ☆1,515 · Updated 5 years ago
- A curated list of long-tailed recognition resources. ☆586 · Updated 2 years ago
- Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019. ☆404 · Updated 4 years ago
- This repository contains code for the paper "Decoupling Representation and Classifier for Long-Tailed Recognition", published at ICLR 202… ☆976 · Updated 4 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019). ☆418 · Updated 5 years ago
- Papers on deep neural network compression and acceleration. ☆401 · Updated 4 years ago
- PyTorch DataLoaders implemented with DALI for accelerating image preprocessing. ☆885 · Updated 5 years ago
- A curated list of neural network pruning resources. ☆2,478 · Updated last year
- A list of multi-task learning papers and projects. ☆381 · Updated 3 years ago
- The official implementation of [CVPR 2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV 2023] DOT: A Distill… ☆874 · Updated last year
- My best practices for training on large datasets with PyTorch. ☆1,106 · Updated last year
- ☆196 · Updated last year
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆260 · Updated 6 years ago
- Summary and code for deep neural network quantization. ☆554 · Updated 4 months ago
- Network Slimming (PyTorch) (ICCV 2017). ☆915 · Updated 4 years ago
- The official PyTorch implementation of the paper BBN: Bilateral-Branch Network with Cumulative Learning for Long-Tailed Visual Recognition. ☆669 · Updated 2 years ago
- A list of high-quality (newest) AutoML works and lightweight models, including 1.) Neural Architecture Search, 2.) Lightweight Structures,… ☆853 · Updated 4 years ago
- Experiments on the paper "Bag of Tricks for Image Classification with Convolutional Neural Networks" and other useful tricks to improve CNN a… ☆737 · Updated 6 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods. ☆2,382 · Updated 2 years ago
- mixup: Beyond Empirical Risk Minimization. ☆1,192 · Updated 4 years ago
- A quickstart and benchmark for PyTorch distributed training. ☆1,666 · Updated last year
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added). ☆265 · Updated 5 years ago
- Assorted PyTorch tricks. ☆1,194 · Updated last year
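Most of the repositories above implement some variant of the classic soft-target distillation objective (Hinton et al., 2015): the student matches the teacher's temperature-softened output distribution in addition to the ground-truth labels. A minimal PyTorch sketch, where the function name and the `T`/`alpha` defaults are illustrative choices, not taken from any specific repo listed here:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic knowledge-distillation loss: a weighted sum of
    (a) KL divergence between temperature-softened student and teacher
        distributions, scaled by T^2 to keep gradient magnitudes stable, and
    (b) the usual hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

Higher temperatures flatten the teacher's distribution, exposing the relative probabilities of wrong classes ("dark knowledge") that many of the listed methods build on.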