kaung-htet-myat / Multi-teachers-Knowledge-Distillation
Distilling knowledge from an ensemble of multiple teacher networks into a student network with multiple heads
☆7 · Updated 3 years ago
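For orientation, the recipe the repository's one-line description points at can be sketched as follows. This is a hedged illustration of the common multi-teacher distillation setup (averaged temperature-softened teacher probabilities, per-head KL plus cross-entropy), not the repository's actual code; the function name, arguments, and defaults below are assumptions.

```python
# Minimal sketch of multi-teacher knowledge distillation for a multi-headed
# student (illustrative only, not this repository's implementation).
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_head_logits, teacher_logits_list, labels,
                          temperature=4.0, alpha=0.5):
    """student_head_logits: list of [B, C] logits, one per student head.
    teacher_logits_list:    list of [B, C] logits, one per teacher.
    labels:                 [B] ground-truth class indices.
    """
    # Soft targets: average the teachers' temperature-softened probabilities.
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=1) for t in teacher_logits_list]
    ).mean(dim=0)

    total = 0.0
    for logits in student_head_logits:
        # Soft loss: KL divergence from the teacher ensemble to this head,
        # scaled by T^2 as in standard Hinton-style distillation.
        kd = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                      teacher_probs, reduction="batchmean") * temperature ** 2
        # Hard loss: cross-entropy against the ground-truth labels.
        ce = F.cross_entropy(logits, labels)
        total = total + alpha * kd + (1.0 - alpha) * ce
    return total / len(student_head_logits)
```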
Alternatives and similar repositories for Multi-teachers-Knowledge-Distillation:
Users interested in Multi-teachers-Knowledge-Distillation are comparing it to the repositories listed below.
- Code release for "Dynamic Domain Adaptation for Efficient Inference" (CVPR 2021) ☆33 · Updated 3 years ago
- ☆23 · Updated 3 years ago
- Code for Active Mixup (CVPR 2020) ☆22 · Updated 3 years ago
- PyTorch implementation of the IEEE TNNLS 2022 paper "Distilling a Powerful Student Model via Online Knowledge Distillation" ☆28 · Updated 3 years ago
- ☆20 · Updated 4 years ago
- NeurIPS 2022: Estimating Noise Transition Matrix with Label Correlations for Noisy Multi-Label Learning ☆17 · Updated 2 years ago
- Code for "Feature Fusion for Online Mutual Knowledge Distillation" ☆25 · Updated 4 years ago
- Official PyTorch code for the CVPR 2021 paper "AutoDO: Robust AutoAugment for Biased Data with Label Noise via Scalable Probabilistic Implici…" ☆24 · Updated 2 years ago