Implementation of model compression with the knowledge distillation method.
☆342 · Jan 3, 2017 · Updated 9 years ago
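The core technique behind this repository (and many of the distillation entries below) is Hinton-style knowledge distillation: the student is trained against the teacher's temperature-softened outputs in addition to the hard labels. The following is a minimal illustrative sketch, not code from any repository listed here; the function names and default hyperparameters are assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.9):
    """Weighted sum of soft-target cross-entropy (student vs. teacher,
    both at temperature T) and hard-label cross-entropy (at T = 1)."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student_soft = softmax(student_logits, temperature)
    # Cross-entropy between the teacher's and student's soft distributions.
    soft_loss = -sum(t * math.log(s) for t, s in zip(p_teacher, p_student_soft))
    # Standard cross-entropy against the one-hot ground-truth label.
    hard_loss = -math.log(softmax(student_logits)[true_label])
    # The T^2 factor rescales soft-target gradients to match the hard loss.
    return alpha * (temperature ** 2) * soft_loss + (1 - alpha) * hard_loss
```

A student whose logits match the teacher's incurs only the irreducible teacher-entropy and hard-label terms, so the loss grows as the student's distribution drifts from the teacher's.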
Alternatives and similar repositories for model_compression
Users interested in model_compression are comparing it to the libraries listed below.
- This is my final-year Bachelor of Engineering project. It is still incomplete. I am trying to replicate the research paper "Deep … ☆77 · Sep 21, 2017 · Updated 8 years ago
- Implementation of Data-free Knowledge Distillation for Deep Neural Networks (on arXiv!) ☆81 · Feb 10, 2018 · Updated 8 years ago
- ☆137 · Oct 22, 2018 · Updated 7 years ago
- Channel Pruning for Accelerating Very Deep Neural Networks (ICCV'17) ☆1,088 · May 2, 2024 · Updated last year
- Awesome Knowledge Distillation ☆3,827 · Dec 25, 2025 · Updated 2 months ago
- Implements quantized distillation. Code for our paper "Model compression via distillation and quantization" ☆336 · Jul 25, 2024 · Updated last year
- FitNets: Hints for Thin Deep Nets ☆211 · May 14, 2015 · Updated 10 years ago
- Knowledge Distillation using TensorFlow ☆140 · Aug 12, 2019 · Updated 6 years ago
- residual-SqueezeNet ☆155 · Mar 15, 2019 · Updated 7 years ago
- Caffe implementation for dynamic network surgery. ☆189 · Aug 15, 2017 · Updated 8 years ago
- ShuffleNet-V2 for both PyTorch and Caffe. ☆505 · Aug 9, 2018 · Updated 7 years ago
- Trains a student network using knowledge obtained by training a larger teacher network ☆160 · Mar 23, 2018 · Updated 7 years ago
- PyTorch implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference ☆887 · Jul 12, 2019 · Updated 6 years ago
- KnowledgeDistillation Layer (Caffe implementation) ☆89 · Jun 8, 2017 · Updated 8 years ago
- Improving Convolutional Networks via Attention Transfer (ICLR 2017) ☆1,466 · Jul 11, 2018 · Updated 7 years ago
- Deep Face Model Compression ☆195 · Aug 21, 2018 · Updated 7 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) ☆265 · Nov 21, 2019 · Updated 6 years ago
- ☆48 · Oct 9, 2022 · Updated 3 years ago
- Caffe for Sparse and Low-rank Deep Neural Networks ☆382 · Mar 8, 2020 · Updated 6 years ago
- Implementation of CVPR 2019 paper: Distilling Object Detectors with Fine-grained Feature Imitation ☆421 · Jul 15, 2021 · Updated 4 years ago
- Learning Efficient Convolutional Networks through Network Slimming, in ICCV 2017 ☆576 · Jul 14, 2019 · Updated 6 years ago
- A list of awesome papers on deep model compression and acceleration ☆349 · Jun 19, 2021 · Updated 4 years ago
- Code and pretrained model for IGCV3 ☆189 · Oct 22, 2018 · Updated 7 years ago
- Deep Compression on AlexNet ☆672 · Mar 5, 2022 · Updated 4 years ago
- Transfer knowledge from a large DNN or an ensemble of DNNs into a small DNN ☆29 · May 8, 2017 · Updated 8 years ago
- An efficient framework for convolutional neural networks ☆279 · Aug 30, 2023 · Updated 2 years ago
- A machine learning experiment ☆180 · Oct 20, 2017 · Updated 8 years ago
- This is a fast Caffe implementation of ShuffleNet. ☆453 · Aug 30, 2018 · Updated 7 years ago
- Implementation of "Iterative pruning" on TensorFlow ☆160 · Apr 15, 2021 · Updated 4 years ago
- ☆45 · Apr 14, 2017 · Updated 8 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆1,983 · Mar 25, 2023 · Updated 2 years ago
- Label Refinery: Improving ImageNet Classification through Label Progression ☆279 · May 26, 2018 · Updated 7 years ago
- Caffe implementation for Incremental Network Quantization ☆191 · Jul 29, 2018 · Updated 7 years ago
- Implementation of Ternary Weight Networks in Caffe ☆63 · Nov 29, 2016 · Updated 9 years ago
- Caffe implementation of Google's MobileNets (v1 and v2) ☆1,274 · Jun 8, 2021 · Updated 4 years ago
- Rethinking the Value of Network Pruning (PyTorch) (ICLR 2019) ☆1,516 · Jun 7, 2020 · Updated 5 years ago
- Training Deep Neural Networks with binary weights during propagations ☆382 · Feb 15, 2016 · Updated 10 years ago
- RON: Reverse Connection with Objectness Prior Networks for Object Detection, CVPR 2017 ☆352 · Mar 22, 2018 · Updated 7 years ago
- Caffe model of the ICCV'17 paper ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression (https://arxiv.org/abs/1707.06342) ☆148 · Sep 19, 2018 · Updated 7 years ago
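Several of the pruning repositories above (channel pruning, iterative pruning, network slimming, ThiNet) start from the same baseline idea: remove the smallest-magnitude weights. A minimal one-shot sketch of that baseline, under the assumption of a flat weight list; this is an illustration only, not the method of any listed repository:

```python
def magnitude_prune(weights, sparsity):
    """One-shot magnitude pruning: zero out the `sparsity` fraction of
    weights with the smallest absolute values. `weights` is a flat list
    of floats; `sparsity` is in [0, 1). Ties at the threshold may prune
    slightly more than the requested fraction."""
    k = int(len(weights) * sparsity)  # number of weights to remove
    if k == 0:
        return list(weights)
    # Threshold at the k-th smallest absolute value.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]
```

Iterative variants (as in the "Iterative pruning" repo above) repeat this prune step with a gradually increasing sparsity target, fine-tuning the network between rounds.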