VICO-UoE / KD4MTL
Knowledge Distillation for Multi-task Learning (ECCV20 Workshops)
☆75 · Updated 2 years ago
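As context for the repositories below: most of them build on the standard knowledge-distillation objective (Hinton et al.), which combines a hard-label cross-entropy term with a KL divergence between temperature-softened teacher and student distributions. A minimal NumPy sketch follows; the function and parameter names are illustrative, not taken from KD4MTL itself:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style distillation loss:
    alpha * CE(hard labels) + (1 - alpha) * T^2 * KL(teacher || student),
    where the KL term uses temperature-softened logits."""
    p_s = softmax(student_logits, T)
    p_t = softmax(teacher_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    p_hard = softmax(student_logits)  # T = 1 for the supervised term
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * ce + (1 - alpha) * (T ** 2) * kl))
```

The T² factor keeps the gradient magnitude of the soft term comparable across temperatures; when teacher and student logits coincide, the KL term vanishes and only the supervised cross-entropy remains.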
Alternatives and similar repositories for KD4MTL
Users interested in KD4MTL are comparing it with the repositories listed below
- AdaShare: Learning What To Share For Efficient Deep Multi-Task Learning ☆112 · Updated 4 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆107 · Updated 4 years ago
- PyTorch code for the ICCV'21 paper: "Always Be Dreaming: A New Approach for Class-Incremental Learning" ☆63 · Updated 2 years ago
- ☆23 · Updated 4 years ago
- The official repository (in PyTorch) for the ECCV 2020 paper "Reparameterizing Convolutions for Incremental Multi-Task Learning without T… ☆32 · Updated last year
- PyTorch implementation of the paper "Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels" (NeurIPS 2018) ☆128 · Updated 5 years ago
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) ☆117 · Updated 4 years ago
- [CVPR 2021] Adaptive Consistency Regularization for Semi-Supervised Transfer Learning ☆104 · Updated 3 years ago
- ResLT: Residual Learning for Long-tailed Recognition (TPAMI 2022) ☆61 · Updated last year
- Improving Calibration for Long-Tailed Recognition (CVPR 2021) ☆148 · Updated 3 years ago
- Universal Representations: A Unified Look at Multiple Task and Domain Learning ☆43 · Updated last year
- Code for CoMatch: Semi-supervised Learning with Contrastive Graph Regularization ☆127 · Updated 2 weeks ago
- The official code for the paper "Delving Deep into Label Smoothing" (IEEE TIP 2021) ☆80 · Updated 2 years ago
- [ICLR 2021] "Learning with Feature-Dependent Label Noise: A Progressive Approach" ☆43 · Updated 2 years ago
- The implementation of "Self-Supervised Generalisation with Meta Auxiliary Learning" (NeurIPS 2019) ☆174 · Updated 3 years ago
- When Does Label Smoothing Help? (PyTorch implementation) ☆124 · Updated 5 years ago
- A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", … ☆176 · Updated 3 years ago
- Implementation of "Distribution Alignment: A Unified Framework for Long-tail Visual Recognition" (CVPR 2021) ☆118 · Updated 3 years ago
- ☆170 · Updated 4 years ago
- [AAAI 2021] Curriculum Labeling: Revisiting Pseudo-Labeling for Semi-Supervised Learning ☆139 · Updated 4 years ago
- [ECCV 2020] Knowledge Distillation Meets Self-Supervision ☆236 · Updated 2 years ago
- Official PyTorch implementation of the MixMo framework ☆84 · Updated 3 years ago
- Code for Feature Fusion for Online Mutual Knowledge Distillation ☆26 · Updated 4 years ago
- Generative Feature Replay For Class-Incremental Learning ☆37 · Updated 5 years ago
- SKD: Self-supervised Knowledge Distillation for Few-shot Learning ☆97 · Updated last year
- ☆46 · Updated 3 years ago
- [AAAI 2021, TKDE 2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration" ☆75 · Updated 9 months ago
- Official repository of the paper "Essentials for Class Incremental Learning" ☆40 · Updated 3 years ago
- Code for the paper "Incremental Learning Techniques for Semantic Segmentation", Michieli U. and Zanuttigh P., ICCVW 2019 ☆59 · Updated 4 years ago
- Learning a Unified Classifier Incrementally via Rebalancing ☆190 · Updated 4 years ago