Jangho-Kim / FFL-pytorch
Feature Fusion for Online Mutual Knowledge Distillation Code
☆26 · Updated 5 years ago
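The repository implements feature fusion for online mutual knowledge distillation, in which peer networks train simultaneously and regularize one another's predictions. As a rough orientation (not the repository's actual API — function and variable names here are illustrative), the core mutual-distillation objective combines a standard cross-entropy term with a temperature-softened KL term between peers:

```python
import torch
import torch.nn.functional as F

def mutual_distillation_loss(logits_a, logits_b, targets, T=3.0):
    """Cross-entropy on the labels plus a KL term that pulls one
    peer's predictions toward the other's softened distribution."""
    ce = F.cross_entropy(logits_a, targets)
    # KL(softmax(logits_b / T) || softmax(logits_a / T)); the peer's
    # output is detached so gradients flow only into network A here.
    # The T*T factor rescales gradients to the hard-label magnitude.
    kl = F.kl_div(
        F.log_softmax(logits_a / T, dim=1),
        F.softmax(logits_b / T, dim=1).detach(),
        reduction="batchmean",
    ) * (T * T)
    return ce + kl

# Illustrative usage with random logits for a 10-class problem.
logits_a = torch.randn(4, 10)
logits_b = torch.randn(4, 10)
targets = torch.randint(0, 10, (4,))
loss = mutual_distillation_loss(logits_a, logits_b, targets)
```

In mutual (online) distillation each network computes this loss with the roles of A and B swapped, so both act as teacher and student at once; FFL additionally fuses intermediate features into a fused classifier, which this sketch omits.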
Alternatives and similar repositories for FFL-pytorch
Users interested in FFL-pytorch are comparing it to the repositories listed below.
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020) ☆29 · Updated 5 years ago
- Code for "Balanced Knowledge Distillation for Long-tailed Learning" ☆27 · Updated last year
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆108 · Updated 5 years ago
- ☆127 · Updated 4 years ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆74 · Updated 2 years ago
- A simple reimplementation of Online Knowledge Distillation via Collaborative Learning with PyTorch ☆49 · Updated 2 years ago
- NeurIPS 2021, "Fine Samples for Learning with Noisy Labels" ☆39 · Updated 3 years ago
- A PyTorch implementation of the paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', … ☆179 · Updated 3 years ago
- Repo for CReST: A Class-Rebalancing Self-Training Framework for Imbalanced Semi-Supervised Learning ☆101 · Updated last year
- [TPAMI-2023] Official implementations of L-MCL: Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition ☆25 · Updated 2 years ago
- Reproducing VID in CVPR2019 (work in progress) ☆20 · Updated 5 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆41 · Updated 2 years ago
- [CVPR 2020] Rethinking Class-Balanced Methods for Long-Tailed Visual Recognition from a Domain Adaptation Perspective ☆24 · Updated 5 years ago
- ICLR 2021, "Learning with feature-dependent label noise: a progressive approach" ☆43 · Updated 2 years ago
- Code for CoMatch: Semi-supervised Learning with Contrastive Graph Regularization ☆128 · Updated 3 months ago
- Code for "Multi-Task Curriculum Framework for Open-Set Semi-Supervised Learning" ☆24 · Updated 5 years ago
- [ICASSP 2020] Code release for the paper 'Heterogeneous Domain Generalization via Domain Mixup' ☆26 · Updated 5 years ago
- Code for "Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification", ECCV 2020 Spotlight ☆38 · Updated 4 years ago
- Improving Calibration for Long-Tailed Recognition (CVPR 2021) ☆148 · Updated 3 years ago
- ☆61 · Updated 3 years ago
- ☆61 · Updated 5 years ago
- SKD: Self-supervised Knowledge Distillation for Few-shot Learning ☆99 · Updated last year
- PyTorch implementation of the paper "SuperLoss: A Generic Loss for Robust Curriculum Learning" in NeurIPS 2020 ☆29 · Updated 4 years ago
- ☆27 · Updated 4 years ago
- Code release for the NeurIPS 2020 paper "Co-Tuning for Transfer Learning" ☆39 · Updated 3 years ago
- [ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition && PyTorch implementations of… ☆109 · Updated 2 years ago
- [AAAI 2021] Curriculum Labeling: Revisiting Pseudo-Labeling for Semi-Supervised Learning ☆139 · Updated 4 years ago
- AMTML-KD: Adaptive Multi-teacher Multi-level Knowledge Distillation ☆61 · Updated 4 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration" ☆75 · Updated last year
- An unofficial PyTorch implementation of "Deep Mutual Learning" for classification on CIFAR-100 ☆166 · Updated 4 years ago