[AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers".
☆76 · Updated Jul 6, 2023
Alternatives and similar repositories for OKDDip
Users interested in OKDDip are comparing it to the repositories listed below.
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆50 · Updated Dec 13, 2022
- ☆61 · Updated Apr 24, 2020
- PyTorch reproduction of Peer Collaborative Learning for Online Knowledge Distillation (AAAI 2021) ☆21 · Updated May 28, 2022
- PyTorch implementation of the IEEE TNNLS 2022 paper "Distilling a Powerful Student Model via Online Knowledge Distillation" ☆31 · Updated Nov 11, 2021
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100 ☆167 · Updated Oct 22, 2020
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration" ☆78 · Updated Jul 29, 2024
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Updated Dec 21, 2022
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆30 · Updated Aug 19, 2020
- [TPAMI-2023] Official implementation of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆27 · Updated Jul 14, 2023
- ☆128 · Updated Nov 2, 2020
- Official implementation of "Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation" (CVPR 2021) ☆103 · Updated Apr 30, 2024
- ☆27 · Updated Feb 6, 2021
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated Dec 15, 2022
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆182 · Updated Jan 29, 2022
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆109 · Updated Jun 18, 2020
- ☆17 · Updated Nov 7, 2024
- Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021) ☆2,657 · Updated May 30, 2023
- TensorFlow implementation of Deep Mutual Learning ☆325 · Updated Apr 10, 2018
- MarginDistillation: distillation for margin-based softmax ☆44 · Updated Sep 28, 2020
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆25 · Updated May 25, 2020
- A PyTorch implementation of scalable neural networks ☆23 · Updated Jun 9, 2020
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆71 · Updated Sep 9, 2019
- Graph Knowledge Distillation ☆13 · Updated Mar 6, 2020
- Switchable Online Knowledge Distillation ☆19 · Updated Oct 27, 2024
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) ☆415 · Updated May 17, 2021
- Code for the ICCV 2021 paper "Distilling Holistic Knowledge with Graph Neural Networks" ☆44 · Updated Dec 14, 2021
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated Jun 9, 2021
- Official repository for Batch Level Distillation (BLD) ☆15 · Updated Jan 25, 2021
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 Oral) ☆584 · Updated Feb 15, 2023
- ☆23 · Updated Oct 27, 2019
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) ☆423 · Updated Jun 23, 2020
- A PyTorch toolbox for domain adaptation, domain generalization, federated learning DA/DG, active learning DA/DG, ALDG and semi-supervised… ☆11 · Updated Jan 10, 2022
- [CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier" ☆103 · Updated Jun 16, 2022
- [IJCV 2022] Domain-Specific Bias Filtering for Single Labeled Domain Generalization ☆12 · Updated Nov 10, 2022
- ☆19 · Updated Jun 26, 2021
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆107 · Updated Nov 28, 2020
- FedMD: Heterogenous Federated Learning via Model Distillation ☆20 · Updated Nov 1, 2019
- Code for the CVPR 2020 paper "Distilling Cross-Task Knowledge via Relationship Matching" ☆48 · Updated Feb 7, 2021
- Implementation of the ICASSP 2022 paper "Confidence-Aware Multi-Teacher Knowledge Distillation" ☆63 · Updated Feb 12, 2022