kaung-htet-myat / Multi-teachers-Knowledge-Distillation
Distilling knowledge from ensemble of multiple teacher networks to student network with multiple heads
☆7 · Updated 3 years ago
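The repository's one-line description can be sketched as a loss function: soften each teacher's logits with a temperature, average the teachers' distributions into an ensemble target, and pull each student head toward that target with a KL divergence. This is a minimal illustrative sketch only, not the repository's actual code; the function names, the shared ensemble target for all heads, and the summed (unweighted) per-head loss are assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-softened softmax over a list of raw logits.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) between two discrete distributions (eps avoids log(0)).
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def multi_teacher_distillation_loss(teacher_logits, student_head_logits,
                                    temperature=4.0):
    # Hypothetical multi-teacher objective: average the teachers' softened
    # distributions, then sum KL(ensemble || head) over the student's heads.
    teacher_probs = [softmax(t, temperature) for t in teacher_logits]
    num_classes = len(teacher_probs[0])
    ensemble = [sum(p[c] for p in teacher_probs) / len(teacher_probs)
                for c in range(num_classes)]
    return sum(kl_divergence(ensemble, softmax(h, temperature))
               for h in student_head_logits)
```

When a student head already matches the ensemble distribution its KL term is zero, so the loss vanishes only when every head agrees with the averaged teachers.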
Alternatives and similar repositories for Multi-teachers-Knowledge-Distillation:
Users interested in Multi-teachers-Knowledge-Distillation are comparing it to the repositories listed below.
- Code release for "Dynamic Domain Adaptation for Efficient Inference" (CVPR 2021)☆33 · Updated 3 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020)☆30 · Updated 4 years ago
- PyTorch implementation for our paper EvidentialMix: Learning with Combined Open-set and Closed-set Noisy Labels☆28 · Updated 4 years ago
- [ECCV 2020] Learning from Extrinsic and Intrinsic Supervisions for Domain Generalization☆48 · Updated 2 years ago
- PyTorch implementation of "Hallucinating Agnostic Images to Generalize Across Domains"☆11 · Updated 5 years ago
- ☆23 · Updated 3 years ago
- Ranking-based-Instance-Selection☆32 · Updated 3 years ago
- Official implementation of the CVPR 2020 paper "Non-Local Neural Networks With Grouped Bilinear Attentional Transforms"☆12 · Updated 4 years ago
- ICML 2020: SIGUA: Forgetting May Make Learning with Noisy Labels More Robust☆15 · Updated 4 years ago
- NeurIPS 2022: Estimating Noise Transition Matrix with Label Correlations for Noisy Multi-Label Learning☆17 · Updated 2 years ago
- [NeurIPS 2021] "Improving Contrastive Learning on Imbalanced Data via Open-World Sampling", Ziyu Jiang, Tianlong Chen, Ting Chen, Zhangya…☆28 · Updated 3 years ago
- Code for Active Mixup (CVPR 2020)☆22 · Updated 3 years ago
- Source code of our submission (Rank 2) for Semi-Supervised Domain Adaptation task in VisDA-2019☆16 · Updated 5 years ago
- ☆20 · Updated 4 years ago
- ☆10 · Updated 5 years ago
- PyTorch Implementation of Temporal Output Discrepancy for Active Learning, ICCV 2021☆41 · Updated 2 years ago
- ☆10 · Updated 2 years ago
- Official PyTorch code for CVPR 2021 paper "AutoDO: Robust AutoAugment for Biased Data with Label Noise via Scalable Probabilistic Implici…☆24 · Updated 2 years ago
- Feature Fusion for Online Mutual Knowledge Distillation Code☆26 · Updated 4 years ago
- A Generic Multi-classifier Paradigm for Incremental Learning☆11 · Updated 4 years ago
- Supervised and semi-supervised image classification with self-supervision (Keras)☆45 · Updated 4 years ago
- Graph Knowledge Distillation☆13 · Updated 5 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation☆28 · Updated 3 years ago
- Source code of our submission (Rank 1) for Multi-Source Domain Adaptation task in VisDA-2019☆51 · Updated 5 years ago
- ☆20 · Updated 2 years ago
- Official repository for Reliable Label Bootstrapping☆19 · Updated 2 years ago
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/…☆12 · Updated 2 years ago
- Implementation and benchmark splits to study Out-of-Distribution Generalization in Deep Metric Learning☆23 · Updated 3 years ago
- Official code of the paper HoMM: Higher-order Moment Matching for Unsupervised Domain Adaptation (AAAI 2020)☆44 · Updated 5 years ago
- (ICML 2021) Implementation for S2SD - Simultaneous Similarity-based Self-Distillation for Deep Metric Learning. Paper link: https://arxiv…☆42 · Updated 4 years ago