lhyfst / knowledge-distillation-papers
knowledge distillation papers
☆752 · Updated 2 years ago

Alternatives and similar repositories for knowledge-distillation-papers:
Users interested in knowledge-distillation-papers are comparing it to the repositories listed below.
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014–2021), organized by category. ☆2,559 · Updated last year
- Pytorch implementation of various Knowledge Distillation (KD) methods. ☆1,672 · Updated 3 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆1,918 · Updated last year
- Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization ☆585 · Updated 2 years ago
- Awesome Knowledge Distillation ☆3,611 · Updated last week
- ☆668 · Updated 3 years ago
- Collection of recent methods on (deep) neural network compression and acceleration. ☆939 · Updated 3 months ago
- Official pytorch Implementation of Relational Knowledge Distillation, CVPR 2019 ☆395 · Updated 3 years ago
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆417 · Updated 4 years ago
- A curated list of neural network pruning resources. ☆2,426 · Updated 11 months ago
- A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quan… ☆619 · Updated 2 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,290 · Updated last year
- Papers for deep neural network compression and acceleration ☆396 · Updated 3 years ago
- Rethinking the Value of Network Pruning (Pytorch) (ICLR 2019) ☆1,516 · Updated 4 years ago
- The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distill… ☆840 · Updated last year
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆257 · Updated 5 years ago
- Summary, Code for Deep Neural Network Quantization ☆546 · Updated 5 months ago
- A curated list of long-tailed recognition resources. ☆582 · Updated last year
- Open-source code for paper "Dataset Distillation" ☆792 · Updated 2 years ago
- Slimmable Networks, AutoSlim, and Beyond, ICLR 2019, and ICCV 2019 ☆917 · Updated 2 years ago
- List of efficient attention modules ☆995 · Updated 3 years ago
- My best practice of training large datasets using PyTorch. ☆1,092 · Updated 10 months ago
- A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures,… ☆852 · Updated 3 years ago
- Distilling Knowledge via Knowledge Review, CVPR 2021 ☆266 · Updated 2 years ago
- PyTorch implementation of Contrastive Learning methods ☆1,966 · Updated last year
- This repository contains code for the paper "Decoupling Representation and Classifier for Long-Tailed Recognition", published at ICLR 202… ☆960 · Updated 3 years ago
- Sublinear memory optimization for deep learning. https://arxiv.org/abs/1604.06174 ☆596 · Updated 5 years ago
- The official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation' (CVPR 2019 ORAL) and extension to other ta… ☆719 · Updated 4 years ago
- Some tricks of pytorch... ☆1,181 · Updated 9 months ago
- Experiments on the paper "Bag of Tricks for Image Classification with Convolutional Neural Networks" and other useful tricks to improve CNN a… ☆724 · Updated 6 years ago
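Many of the repositories above implement variants of the classic knowledge distillation objective of Hinton et al. (2015): a weighted sum of a temperature-scaled soft-target term against the teacher's outputs and a standard cross-entropy term against the ground-truth labels. A minimal NumPy sketch (the function name, temperature `T=4.0`, and weight `alpha=0.9` are illustrative choices, not taken from any specific repository listed here):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis, numerically stabilized."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic KD loss: alpha * soft-target term + (1 - alpha) * hard-label term."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    # Soft term: cross-entropy against the teacher's softened distribution
    # (equals the KL divergence up to the teacher's entropy, a constant w.r.t.
    # the student). The T*T factor compensates for the 1/T^2 gradient scaling.
    soft = -(p_teacher * log_p_student).sum(axis=1).mean() * (T * T)
    # Hard term: ordinary cross-entropy against the ground-truth labels (T=1)
    probs = softmax(student_logits)
    hard = -np.log(probs[np.arange(len(labels)), labels]).mean()
    return alpha * soft + (1 - alpha) * hard
```

The surveyed methods mostly differ in what replaces or augments the soft term, e.g. intermediate-feature matching, relational structure, or contrastive objectives, while keeping this overall student/teacher training loop.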