Trains a student network using the knowledge obtained by training a larger teacher network
☆160 · Mar 23, 2018 · Updated 8 years ago
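The technique these repositories implement follows Hinton et al.'s soft-target distillation: the student is trained against the teacher's temperature-softened output distribution in addition to the true labels. A minimal NumPy sketch of that loss, assuming logit arrays as input; the temperature `T` and mixing weight `alpha` are illustrative hyperparameters, not values from any listed repository:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields softer probabilities.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style KD loss: a weighted sum of a soft term (cross-entropy
    against the teacher's softened outputs, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures) and a hard term
    (ordinary cross-entropy against the true labels)."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student_T = np.log(softmax(student_logits, T) + 1e-12)
    soft = -(p_teacher * log_p_student_T).sum(axis=-1).mean() * (T ** 2)

    log_p_student = np.log(softmax(student_logits) + 1e-12)
    hard = -log_p_student[np.arange(len(labels)), labels].mean()
    return alpha * soft + (1.0 - alpha) * hard
```

A student whose logits match the teacher's scores a lower loss than one that disagrees, which is what the soft term rewards; in practice the same loss is usually written with a framework's KL-divergence primitive so it can be backpropagated.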
Alternatives and similar repositories for Distilling-the-knowledge-in-neural-network
Users that are interested in Distilling-the-knowledge-in-neural-network are comparing it to the libraries listed below.
- PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression ☆59 · Nov 18, 2017 · Updated 8 years ago
- Implementation of model compression with the knowledge distillation method ☆342 · Jan 3, 2017 · Updated 9 years ago
- Demonstration of transfer of knowledge and generalization with distillation ☆57 · Jan 15, 2019 · Updated 7 years ago
- Knowledge Distillation using TensorFlow ☆140 · Aug 12, 2019 · Updated 6 years ago
- FitNets: Hints for Thin Deep Nets ☆210 · May 14, 2015 · Updated 10 years ago
- Awesome Knowledge Distillation ☆3,858 · Mar 22, 2026 · Updated last month
- Code for SPIBB-DQN and Soft-SPIBB-DQN ☆11 · May 5, 2020 · Updated 6 years ago
- Deep Neural Network Compression based on Student-Teacher Network ☆14 · Jul 6, 2023 · Updated 2 years ago
- TensorFlow Implementation of Deep Mutual Learning ☆325 · Apr 10, 2018 · Updated 8 years ago
- ☆23 · Nov 2, 2022 · Updated 3 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆106 · Sep 9, 2019 · Updated 6 years ago
- Reward Learning by Simulating the Past ☆46 · May 9, 2019 · Updated 6 years ago
- TextBoxes implementation with TensorFlow (Python) ☆20 · Mar 28, 2017 · Updated 9 years ago
- ☆17 · Oct 13, 2019 · Updated 6 years ago
- A PyTorch implementation for flexibly exploring deep and shallow knowledge distillation (KD) experiments ☆1,985 · Mar 25, 2023 · Updated 3 years ago
- ☆23 · Sep 29, 2021 · Updated 4 years ago
- This repository stores the files used for my summer internship's work on "teacher-student learning", an experimental method for training … ☆48 · Jan 26, 2019 · Updated 7 years ago
- Source code for 'Knowledge Distillation via Instance Relationship Graph' ☆30 · Jun 13, 2019 · Updated 6 years ago
- Notes from the Simons Institute program "Foundations of Machine Learning" ☆13 · May 5, 2017 · Updated 9 years ago
- Implementation of the paper "Neighbor2Neighbor: Self-Supervised Denoising From Single Noisy Images" (2021) ☆16 · Feb 26, 2021 · Updated 5 years ago
- A PyTorch implementation of fine-grained few-shot classification using triplet loss ☆11 · Feb 24, 2019 · Updated 7 years ago
- [ECCV 2020] PyTorch code for Open-set Adversarial Defense ☆21 · Mar 20, 2022 · Updated 4 years ago
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) ☆265 · Nov 21, 2019 · Updated 6 years ago
- ☆26 · Nov 2, 2017 · Updated 8 years ago
- Implementation for the paper "Few-Shot Learning with Global Class Representations" (https://arxiv.org/abs/1908.05257) ☆18 · Oct 19, 2022 · Updated 3 years ago
- ☆18 · Jun 15, 2019 · Updated 6 years ago
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) ☆177 · Feb 20, 2020 · Updated 6 years ago
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆418 · May 17, 2021 · Updated 4 years ago
- Language model toolkits with a hierarchical softmax setting ☆17 · Mar 23, 2018 · Updated 8 years ago
- Implementation of the ICLR 2017 paper "Loss-aware Binarization of Deep Networks" ☆20 · Feb 24, 2019 · Updated 7 years ago
- Implementation of a quantized Transformer model ☆20 · Mar 20, 2019 · Updated 7 years ago
- Experimental quantized net implementation in Chainer ☆11 · Oct 1, 2016 · Updated 9 years ago
- Distilling BERT using natural language generation ☆39 · Aug 13, 2023 · Updated 2 years ago
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf ☆264 · Oct 3, 2019 · Updated 6 years ago
- ☆25 · Feb 19, 2020 · Updated 6 years ago
- The official code for the paper 'Structured Knowledge Distillation for Semantic Segmentation' (CVPR 2019 oral) and extension to other ta… ☆740 · Apr 20, 2020 · Updated 6 years ago
- ☆14 · Sep 25, 2016 · Updated 9 years ago
- The code base for the SCL implementation used in "Neural Structural Correspondence Learning for Domain Adaptation", CoNLL 2017, and in "Pi… ☆22 · Jul 2, 2018 · Updated 7 years ago
- PyTorch implementation of Wide Residual Networks with 1-bit weights by McDonnell (ICLR 2018) ☆126 · Sep 6, 2018 · Updated 7 years ago