A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning
☆50 · Updated Dec 13, 2022
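In online collaborative distillation (KDCL-style), several peer students are trained jointly without a pre-trained teacher: each peer learns from the ground-truth labels and from a soft target built by ensembling the peers' own predictions. The sketch below illustrates the idea for two peers using a simple average of their softened logits as the ensemble; the paper proposes several ensembling variants, and the function name and hyperparameters here are illustrative, not this repository's API.

```python
# Minimal sketch of a KDCL-style collaborative distillation loss.
# Assumptions (not taken from the repo): two peers, soft target is the
# detached average of both peers' temperature-softened predictions.
import torch
import torch.nn.functional as F

def kdcl_loss(logits_a, logits_b, targets, T=3.0, alpha=0.5):
    """Cross-entropy for each peer plus KL to the peer-ensemble soft target."""
    # Ensemble soft target: average of both peers' softened outputs,
    # detached so gradients do not flow through the "teacher" signal.
    soft = ((F.softmax(logits_a / T, dim=1)
             + F.softmax(logits_b / T, dim=1)) / 2).detach()
    # Supervised term for both peers.
    ce = F.cross_entropy(logits_a, targets) + F.cross_entropy(logits_b, targets)
    # Distillation term: KL(peer || ensemble), scaled by T^2 as usual
    # so gradient magnitudes stay comparable across temperatures.
    kd = (F.kl_div(F.log_softmax(logits_a / T, dim=1), soft, reduction="batchmean")
          + F.kl_div(F.log_softmax(logits_b / T, dim=1), soft, reduction="batchmean")) * (T * T)
    return (1 - alpha) * ce + alpha * kd
```

In a training loop, both peers' logits come from the same batch and this single scalar is backpropagated through both networks, so each peer simultaneously acts as student and as part of the ensemble teacher.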
Alternatives and similar repositories for Online-Knowledge-Distillation-via-Collaborative-Learning
Users interested in Online-Knowledge-Distillation-via-Collaborative-Learning are comparing it to the libraries listed below.
- [AAAI-2020] Official implementation of "Online Knowledge Distillation with Diverse Peers". ☆76 · Updated Jul 6, 2023
- PyTorch implementation of the paper "Distilling a Powerful Student Model via Online Knowledge Distillation", IEEE TNNLS, 2022. ☆31 · Updated Nov 11, 2021
- PyTorch reproduction of "Peer Collaborative Learning for Online Knowledge Distillation", AAAI 2021. ☆21 · Updated May 28, 2022
- [TPAMI-2023] Official implementation of L-MCL: "Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition". ☆27 · Updated Jul 14, 2023
- Code for "Feature Fusion for Online Mutual Knowledge Distillation". ☆27 · Updated Jul 21, 2020
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021). ☆103 · Updated Apr 30, 2024
- ☆128 · Updated Nov 2, 2020
- Official PyTorch implementation of "Unsupervised Representation Learning for Binary Networks by Joint Classifier Training" (CVPR 2022). ☆11 · Updated Apr 10, 2022
- ☆19 · Updated Nov 11, 2019
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023). ☆31 · Updated Jul 5, 2023
- "Learning both Weights and Connections for Efficient Neural Networks" (https://arxiv.org/abs/1506.02626). ☆18 · Updated Oct 7, 2020
- Code for "Network Binarization via Contrastive Learning", accepted to ECCV 2022. ☆14 · Updated Jul 13, 2022
- [AAAI-2021, TKDE-2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration". ☆78 · Updated Jul 29, 2024
- Code for the paper "Few Shot Network Compression via Cross Distillation", AAAI 2020. ☆30 · Updated Jan 31, 2020
- "Distilling Object Detectors with Feature Richness". ☆43 · Updated Apr 15, 2022
- Official implementation of "Pay Attention to Your Positive Pairs: Positive Pair Aware Contrastive Knowledge Distillation" (ACM MM 2022). ☆11 · Updated May 27, 2023
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021). ☆119 · Updated Feb 9, 2021
- [ECCV 2020] "Knowledge Distillation Meets Self-Supervision". ☆237 · Updated Dec 15, 2022
- PyTorch implementation of "Matching Guided Distillation" (ECCV 2020). ☆66 · Updated Aug 7, 2021
- [ECCV-2022] Official implementation of "MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition" && PyTorch implementations of… ☆109 · Updated Nov 28, 2022
- ☆31 · Updated Jun 18, 2020
- Implementation of the paper "Task-Oriented Feature Distillation". ☆43 · Updated Apr 25, 2022
- A PyTorch implementation of the paper "In Defense of the Triplet Loss for Person Re-Identification" (https://arxiv.org/abs/1703.07737). I… ☆42 · Updated Sep 26, 2018
- ☆13 · Updated May 21, 2023
- AutoDIAL Caffe implementation. ☆27 · Updated Jul 24, 2017
- Graph Knowledge Distillation. ☆13 · Updated Mar 6, 2020
- Official toolkit for the Multi-View Layout Estimation Challenge in the OmniCV workshop at CVPR 2023. ☆16 · Updated Jun 1, 2023
- ☆37 · Updated Feb 1, 2022
- Unofficial PyTorch implementation of Born-Again Neural Networks. ☆57 · Updated Mar 24, 2021
- A reproduction of "Pruning Filters for Efficient ConvNets". ☆28 · Updated Jun 15, 2020
- Code for "Self-Distillation as Instance-Specific Label Smoothing". ☆15 · Updated Oct 22, 2020
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021). ☆2,663 · Updated May 30, 2023
- "Role-Wise Data Augmentation for Knowledge Distillation". ☆19 · Updated Nov 22, 2022
- ☆13 · Updated Dec 11, 2021
- Processing-in-Memory Architecture for Multiply-Accumulate Operations with Hybrid Memory Cube. ☆12 · Updated Feb 13, 2017
- ☆37 · Updated Jun 21, 2022
- HVSMR 2016: MICCAI Workshop on Whole-Heart and Great Vessel Segmentation from 3D Cardiovascular MRI in Congenital Heart Disease. ☆12 · Updated Mar 11, 2020
- ☆12 · Updated Mar 1, 2024
- [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods. ☆2,431 · Updated Oct 16, 2023