PyTorch Implementation of Matching Guided Distillation [ECCV'20]
☆66 · Updated Aug 7, 2021
Alternatives and similar repositories for mgd
Users interested in mgd are comparing it to the repositories listed below.
- Official code for the paper "Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation" · ☆17 · Updated Jan 19, 2022
- PyTorch implementation of Channel Distillation · ☆103 · Updated Jun 9, 2020
- Provides benchmarks for multiple QNNs · ☆11 · Updated Nov 5, 2023
- Implementation of the paper "Task-Oriented Feature Distillation" · ☆43 · Updated Apr 25, 2022
- ☆26 · Updated Nov 9, 2024
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) · ☆423 · Updated Jun 23, 2020
- Official PyTorch implementation of Distilling Image Classifiers in Object Detection (NeurIPS 2021) · ☆32 · Updated Feb 10, 2022
- Pilgrim Project: torch2trt, quickly convert your PyTorch model to a TensorRT engine · ☆19 · Updated Oct 10, 2020
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks (NeurIPS 2020 workshop) · ☆701 · Updated Dec 24, 2021
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) · ☆103 · Updated Apr 30, 2024
- MarginDistillation: distillation for margin-based softmax · ☆44 · Updated Sep 28, 2020
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) · ☆30 · Updated Aug 19, 2020
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) · ☆119 · Updated Feb 9, 2021
- ☆10 · Updated May 9, 2019
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (TNNLS), https://ieeexplore.ieee.org/abstract/… · ☆12 · Updated Dec 21, 2022
- ☆11 · Updated Apr 3, 2023
- Implementation of Selective_Backpropagation from the paper "Accelerating Deep Learning by Focusing on the Biggest Losers" · ☆15 · Updated Feb 2, 2020
- Learning Compatible Embeddings (ICCV 2021) · ☆33 · Updated Aug 18, 2021
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning · ☆50 · Updated Dec 13, 2022
- ☆49 · Updated Jul 28, 2020
- ☆10 · Updated Jul 5, 2019
- Channel-wise Distillation for Semantic Segmentation · ☆78 · Updated Nov 26, 2020
- [ICML 2023] MTPD: Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation · ☆16 · Updated Sep 12, 2023
- Deep Structured Instance Graph for Distilling Object Detectors (ICCV 2021) · ☆35 · Updated Oct 17, 2021
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 Oral) · ☆583 · Updated Feb 15, 2023
- Official implementation of MEAL: Multi-Model Ensemble via Adversarial Learning (AAAI 2019) · ☆177 · Updated Feb 20, 2020
- [TPAMI 2023] Official implementation of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition · ☆27 · Updated Jul 14, 2023
- [ICML 2024] DetKDS: Knowledge Distillation Search for Object Detectors · ☆19 · Updated Jul 11, 2024
- WIDER FACE annotations converted to the Pascal VOC XML format · ☆16 · Updated Jun 14, 2019
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) · ☆109 · Updated Jun 18, 2020
- ☆13 · Updated Jun 16, 2024
- ☆21 · Updated Jul 6, 2022
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods · ☆2,431 · Updated Oct 16, 2023
- MaskRCNN with Knowledge Distillation · ☆21 · Updated Nov 6, 2020
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf · ☆264 · Updated Oct 3, 2019
- Self-Distribution BNN · ☆10 · Updated Mar 8, 2022
- The official code for the paper "Structured Knowledge Distillation for Semantic Segmentation" (CVPR 2019 Oral) and extension to other ta… · ☆741 · Updated Apr 20, 2020
- Distilling Knowledge via Knowledge Review (CVPR 2021) · ☆277 · Updated Dec 16, 2022
- ☆12 · Updated Jun 28, 2021
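Nearly every repository listed above builds on the same basic teacher–student objective. As orientation for readers new to the area, here is a minimal PyTorch sketch of the classic Hinton-style distillation loss; it is a generic illustration, not code from mgd or any of the listed repos, and the temperature `T` and mixing weight `alpha` are illustrative defaults:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Classic knowledge-distillation loss: a weighted sum of
    (a) KL divergence between temperature-softened teacher and student
    distributions, and (b) cross-entropy on the ground-truth labels.
    T and alpha are illustrative hyperparameters, not values from mgd."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),   # student log-probs
        F.softmax(teacher_logits / T, dim=1),       # teacher probs
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-target gradients match the hard term
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

# Example: random logits for a batch of 8 samples over 10 classes.
s = torch.randn(8, 10)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
loss = kd_loss(s, t, y)
print(loss.item())
```

The feature-distillation methods in the list (MGD, Overhaul, CRD, Knowledge Review, etc.) replace or augment the soft-logit term with losses on intermediate feature maps, but typically keep this logit-level term in the overall objective.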