A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning
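As a rough illustration of the technique this repo reimplements, below is a minimal sketch of an online collaborative distillation loss: several peer students train together, and each one distills from an averaged ensemble of the peers' softened predictions in addition to the hard labels. The function name `kdcl_loss`, its signature, and the simple mean aggregation are assumptions for illustration; the paper and this repo may aggregate peer outputs differently.

```python
import torch
import torch.nn.functional as F

def kdcl_loss(logits_list, targets, temperature=3.0):
    """Online collaborative distillation loss (illustrative sketch).

    Each peer minimizes cross-entropy on the hard labels plus a KL term
    pulling its tempered predictions toward the ensemble's averaged
    soft targets. The mean-over-peers aggregation here is an assumption;
    the original method may weight or select peer outputs differently.
    """
    # Soft target: average of all peers' temperature-softened predictions,
    # detached so it acts as a fixed teacher signal for this step.
    with torch.no_grad():
        soft = torch.stack(
            [F.softmax(l / temperature, dim=1) for l in logits_list]
        ).mean(dim=0)

    loss = 0.0
    for logits in logits_list:
        ce = F.cross_entropy(logits, targets)  # hard-label term
        kd = F.kl_div(
            F.log_softmax(logits / temperature, dim=1),
            soft,
            reduction="batchmean",
        ) * temperature ** 2  # rescale gradients, standard in distillation
        loss = loss + ce + kd
    return loss / len(logits_list)
```

In a training loop, each peer's logits on the same batch would be collected into `logits_list` and the shared loss backpropagated through all peers jointly.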
☆50 · Updated Dec 13, 2022
Alternatives and similar repositories for Online-Knowledge-Distillation-via-Collaborative-Learning
Users interested in Online-Knowledge-Distillation-via-Collaborative-Learning are comparing it to the libraries listed below.
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation ☆31 · Updated Nov 11, 2021
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆76 · Updated Jul 6, 2023
- [TPAMI-2023] Official implementation of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆26 · Updated Jul 14, 2023
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Updated Dec 21, 2022
- PyTorch reproduction of Peer Collaborative Learning for Online Knowledge Distillation, AAAI 2021 ☆21 · Updated May 28, 2022
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆25 · Updated May 25, 2020
- Code for Feature Fusion for Online Mutual Knowledge Distillation ☆27 · Updated Jul 21, 2020
- ☆61 · Updated Apr 24, 2020
- ☆16 · Updated Feb 3, 2020
- Official implementation for (Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation, CVPR-2021) ☆103 · Updated Apr 30, 2024
- ☆128 · Updated Nov 2, 2020
- Official PyTorch implementation of Unsupervised Representation Learning for Binary Networks by Joint Classifier Training (CVPR 2022) ☆11 · Updated Apr 10, 2022
- The official implementation of [ACMMM2022] Pay Attention to Your Positive Pairs: Positive Pair Aware Contrastive Knowledge Distillation ☆11 · Updated May 27, 2023
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) ☆30 · Updated Jul 5, 2023
- Graph Knowledge Distillation ☆13 · Updated Mar 6, 2020
- Code for the paper "Few Shot Network Compression via Cross Distillation", AAAI 2020. ☆31 · Updated Jan 31, 2020
- ☆31 · Updated Jun 18, 2020
- Data-free knowledge distillation using Gaussian noise (NeurIPS paper) ☆15 · Updated Mar 24, 2023
- Online Collaborative Topic Regression ☆14 · Updated Mar 11, 2018
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆16 · Updated Oct 22, 2020
- Code for Network Binarization via Contrastive Learning, accepted to ECCV 2022 ☆14 · Updated Jul 13, 2022
- ☆19 · Updated Nov 11, 2019
- Topology Distillation for Recommender System (KDD'21) ☆13 · Updated Sep 2, 2021
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆119 · Updated Feb 9, 2021
- Triplet Loss for Knowledge Distillation ☆18 · Updated Sep 4, 2022
- PyTorch implementation for the paper "Towards Realistic Predictors" ☆17 · Updated Sep 26, 2018
- A PyTorch implementation of the "In Defense of the Triplet Loss for Person Re-Identification" paper (https://arxiv.org/abs/1703.07737). I… ☆42 · Updated Sep 26, 2018
- Role-Wise Data Augmentation for Knowledge Distillation ☆19 · Updated Nov 22, 2022
- PyTorch implementation of Matching Guided Distillation [ECCV'20] ☆67 · Updated Aug 7, 2021
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆78 · Updated Jul 29, 2024
- Squeeze-and-Excitation network implementation ☆18 · Updated May 26, 2019
- Implementation of the paper "Task-Oriented Feature Distillation" ☆43 · Updated Apr 25, 2022
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated Dec 15, 2022
- QCRAFT AutoScheduler: a library that allows users to automatically schedule the execution of their own quantum circuits, improving effici… ☆18 · Updated Oct 28, 2025
- Implementations of knowledge distillation and knowledge transfer models in neural networks ☆23 · Updated May 12, 2019
- [ICASSP-2021] Official implementation of Multi-View Contrastive Learning for Online Knowledge Distillation (MCL-OKD) ☆27 · Updated Apr 7, 2021
- TF-FD ☆20 · Updated Nov 19, 2022
- Unofficial PyTorch implementation of Born-Again Neural Networks ☆57 · Updated Mar 24, 2021
- A PyTorch implementation of scalable neural networks ☆23 · Updated Jun 9, 2020