kaung-htet-myat / Multi-teachers-Knowledge-Distillation
Distilling knowledge from an ensemble of multiple teacher networks into a student network with multiple heads
☆7 · Updated 3 years ago
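A minimal sketch (not this repository's actual code) of the multi-teacher, multi-head distillation setup the description refers to: each student head is trained to match the softened output of one teacher via KL divergence. All module names, dimensions, and the temperature value are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadStudent(nn.Module):
    """Shared backbone with one classification head per teacher (illustrative)."""
    def __init__(self, in_dim=784, hidden=128, num_classes=10, num_heads=3):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, num_classes) for _ in range(num_heads)]
        )

    def forward(self, x):
        feat = self.backbone(x)
        return [head(feat) for head in self.heads]  # one logit tensor per head

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between softened teacher and student distributions,
    averaged over all (head, teacher) pairs."""
    loss = 0.0
    for s_logit, t_logit in zip(student_logits, teacher_logits):
        loss += F.kl_div(
            F.log_softmax(s_logit / T, dim=1),
            F.softmax(t_logit / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
    return loss / len(student_logits)

# Usage sketch: in practice the teacher logits come from frozen pre-trained networks.
student = MultiHeadStudent()
x = torch.randn(8, 784)
teacher_logits = [torch.randn(8, 10) for _ in range(3)]  # stand-ins for real teachers
loss = distillation_loss(student(x), teacher_logits)
loss.backward()
```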
Alternatives and similar repositories for Multi-teachers-Knowledge-Distillation:
Users who are interested in Multi-teachers-Knowledge-Distillation are comparing it to the repositories listed below.
- Code release for "Dynamic Domain Adaptation for Efficient Inference" (CVPR 2021) ☆33 · Updated 2 years ago
- This is an official implementation of our CVPR 2020 paper "Non-Local Neural Networks With Grouped Bilinear Attentional Transforms". ☆12 · Updated 3 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020) ☆30 · Updated 4 years ago
- ICML'20: SIGUA: Forgetting May Make Learning with Noisy Labels More Robust ☆13 · Updated 4 years ago
- A PyTorch implementation for Unsupervised Data Augmentation ☆23 · Updated 2 years ago
- Official PyTorch code for CVPR 2021 paper "AutoDO: Robust AutoAugment for Biased Data with Label Noise via Scalable Probabilistic Implici… ☆24 · Updated 2 years ago
- Pytorch implementation of our paper accepted by IEEE TNNLS, 2022 -- Distilling a Powerful Student Model via Online Knowledge Distillation ☆28 · Updated 3 years ago
- Code for Active Mixup (CVPR 2020) ☆22 · Updated 3 years ago
- Source code of our submission (Rank 1) for the Multi-Source Domain Adaptation task in VisDA-2019 ☆51 · Updated 5 years ago
- ☆20 · Updated 4 years ago
- Code for the paper "Addressing Model Vulnerability to Distributional Shifts over Image Transformation Sets", ICCV 2019 ☆27 · Updated 4 years ago
- Ranking-based-Instance-Selection ☆32 · Updated 3 years ago
- Pytorch implementation of "Hallucinating Agnostic Images to Generalize Across Domains" ☆11 · Updated 5 years ago
- Graph Knowledge Distillation ☆13 · Updated 4 years ago
- [ECCV 2020] Learning from Extrinsic and Intrinsic Supervisions for Domain Generalization ☆48 · Updated 2 years ago
- PyTorch implementation for our paper EvidentialMix: Learning with Combined Open-set and Closed-set Noisy Labels ☆28 · Updated 4 years ago
- ISD: Self-Supervised Learning by Iterative Similarity Distillation ☆36 · Updated 3 years ago
- Source code of our submission (Rank 2) for the Semi-Supervised Domain Adaptation task in VisDA-2019 ☆16 · Updated 5 years ago
- Official repository for Reliable Label Bootstrapping ☆19 · Updated last year
- ☆22 · Updated 3 years ago
- Code for the ICML 2021 paper "Sharing Less is More: Lifelong Learning in Deep Networks with Selective Layer Transfer" ☆11 · Updated 3 years ago
- [ICASSP 2020] Code release of the paper 'Heterogeneous Domain Generalization via Domain Mixup' ☆24 · Updated 4 years ago
- Source Code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation", TNNLS, https://ieeexplore.ieee.org/abstract/… ☆12 · Updated 2 years ago
- Energy-based Out-of-distribution Detection ☆15 · Updated 4 years ago
- Code for "Adversarial-Learned Loss for Domain Adaptation" (AAAI 2020) in PyTorch ☆52 · Updated last year
- ☆19 · Updated last year
- Prior Knowledge Guided Unsupervised Domain Adaptation (ECCV 2022) ☆16 · Updated 2 years ago
- Implementation of Learning to Combine: Knowledge Aggregation for Multi-Source Domain Adaptation (ECCV 2020) ☆69 · Updated 2 years ago
- [NeurIPS 2021] “Improving Contrastive Learning on Imbalanced Data via Open-World Sampling”, Ziyu Jiang, Tianlong Chen, Ting Chen, Zhangya… ☆28 · Updated 3 years ago