DushyantaDhyani / kdtf
Knowledge Distillation using TensorFlow
☆142Updated 6 years ago
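Since this page collects knowledge-distillation repositories, a minimal sketch of the classic distillation objective (the soft-target formulation from "Distilling the Knowledge in a Neural Network") may help orient readers. This is an illustrative pure-Python version, not kdtf's actual API; the function names and the `T`/`alpha` defaults below are assumptions for the example:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label, T=4.0, alpha=0.9):
    # Soft-target term: cross-entropy between the teacher's and student's
    # temperature-softened distributions, scaled by T^2 so its gradient
    # magnitude stays comparable to the hard-label term.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -sum(pt * math.log(ps) for pt, ps in zip(p_teacher, p_student)) * T * T
    # Hard-target term: ordinary cross-entropy on the ground-truth label.
    hard = -math.log(softmax(student_logits)[hard_label])
    # alpha weights the soft term against the hard term.
    return alpha * soft + (1.0 - alpha) * hard
```

A student whose logits match the teacher's incurs a lower loss than one that inverts them, which is the signal the student is trained on; most of the repositories below implement variants of this idea (hints, activation boundaries, teacher assistants, etc.).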
Alternatives and similar repositories for kdtf
Users interested in kdtf are comparing it to the repositories listed below
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added)☆265Updated 5 years ago
- A machine learning experiment☆180Updated 7 years ago
- Implementation of model compression with the knowledge distillation method.☆342Updated 8 years ago
- Teaches a student network from the knowledge obtained via training of a larger teacher network☆158Updated 7 years ago
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf☆259Updated 5 years ago
- Random miniprojects with pytorch.☆170Updated 6 years ago
- Focal Loss of multi-classification in tensorflow☆81Updated 6 years ago
- A universal and efficient framework for training a well-performing light net☆125Updated 8 years ago
- Network acceleration methods☆177Updated 4 years ago
- Cyclic learning rate TensorFlow implementation.☆66Updated 6 years ago
- The codes for recent knowledge distillation algorithms and benchmark results via TF2.0 low-level API☆111Updated 3 years ago
- ☆169Updated 4 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)☆105Updated 5 years ago
- A list of awesome papers on deep model compression and acceleration☆350Updated 4 years ago
- Corrupted labels and label smoothing☆129Updated 7 years ago
- The implementation of the focal loss proposed in "Focal Loss for Dense Object Detection" by KM He, with support for multi-label datasets.☆313Updated 7 years ago
- Keras + tensorflow experiments with knowledge distillation on EMNIST dataset☆34Updated 7 years ago
- Label Refinery: Improving ImageNet Classification through Label Progression☆278Updated 7 years ago
- Complementary code for the Targeted Dropout paper☆254Updated 5 years ago
- Code for the NeurIPS'19 paper "Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks"☆196Updated 5 years ago
- Tensorflow code for Differentiable architecture search☆72Updated 6 years ago
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression☆59Updated 7 years ago
- TensorFlow code for training different architectures (DenseNet, ResNet, AlexNet, GoogLeNet, VGG, NiN) on the ImageNet dataset + multi-GPU support☆169Updated 6 years ago
- Papers for deep neural network compression and acceleration☆400Updated 4 years ago
- A simple tutorial for MXNet, a fast deep learning framework☆103Updated 7 years ago
- A large scale study of Knowledge Distillation.☆220Updated 5 years ago
- Implements quantized distillation. Code for our paper "Model compression via distillation and quantization"☆336Updated last year
- Converting a pretrained pytorch model to tensorflow☆94Updated 8 years ago
- FitNets: Hints for Thin Deep Nets☆208Updated 10 years ago
- Implementation and experiments for AdamW in PyTorch☆94Updated 5 years ago