VICO-UoE / KD4MTL
Knowledge Distillation for Multi-task Learning (ECCV20 Workshops)
☆76Updated 2 years ago
Alternatives and similar repositories for KD4MTL
Users interested in KD4MTL are comparing it to the libraries listed below.
- AdaShare: Learning What To Share For Efficient Deep Multi-Task Learning☆112Updated 4 years ago
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020)☆109Updated 5 years ago
- Code for CoMatch: Semi-supervised Learning with Contrastive Graph Regularization☆128Updated 4 months ago
- Official Pytorch implementation of MixMo framework☆84Updated 4 years ago
- CrossNorm and SelfNorm for Generalization under Distribution Shifts, ICCV 2021☆128Updated 4 years ago
- [ECCV2020] Knowledge Distillation Meets Self-Supervision☆237Updated 2 years ago
- Improving Calibration for Long-Tailed Recognition (CVPR2021)☆149Updated 3 years ago
- PyTorch implementation of the paper "Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels" in NIPS 2018☆128Updated 5 years ago
- When Does Label Smoothing Help? (PyTorch implementation)☆126Updated 5 years ago
- [AAAI 2021] Curriculum Labeling: Revisiting Pseudo-Labeling for Semi-Supervised Learning☆139Updated 4 years ago
- AMTML-KD: Adaptive Multi-teacher Multi-level Knowledge Distillation☆61Updated 4 years ago
- The official repository (in PyTorch) for the ECCV 2020 paper "Reparameterizing Convolutions for Incremental Multi-Task Learning without T…☆32Updated last year
- SKD : Self-supervised Knowledge Distillation for Few-shot Learning