kaung-htet-myat / Multi-teachers-Knowledge-Distillation
Distilling knowledge from ensemble of multiple teacher networks to student network with multiple heads
☆8 · Updated 3 years ago
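The multi-head, multi-teacher setup described above can be sketched as a distillation loss in which each student head matches one teacher's softened output distribution. Below is a minimal NumPy sketch; the function names, the pairing of one head per teacher, and the temperature value are illustrative assumptions, not the repository's actual API:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_kd_loss(teacher_logits, student_head_logits, T=4.0):
    """Average KL(teacher_i || head_i) over teacher/head pairs.

    Each student head is distilled from one teacher; the per-pair KL
    is scaled by T^2, as is standard for temperature-based distillation.

    teacher_logits, student_head_logits: lists of (batch, classes) arrays,
    one entry per teacher and per matching student head.
    """
    losses = []
    for t_log, s_log in zip(teacher_logits, student_head_logits):
        p = softmax(t_log, T)  # softened teacher distribution
        q = softmax(s_log, T)  # softened student-head distribution
        kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)  # per-sample KL
        losses.append(kl.mean() * T * T)
    return float(np.mean(losses))

# Toy example: two teachers, batch of 3, 5 classes.
rng = np.random.default_rng(0)
teachers = [rng.normal(size=(3, 5)) for _ in range(2)]
heads = [rng.normal(size=(3, 5)) for _ in range(2)]
loss = multi_teacher_kd_loss(teachers, heads)
```

In practice this soft-label term is usually combined with an ordinary cross-entropy loss on the ground-truth labels; the weighting between the two is a tuning choice.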
Alternatives and similar repositories for Multi-teachers-Knowledge-Distillation
Users interested in Multi-teachers-Knowledge-Distillation are comparing it to the repositories listed below.
- Code release for "Dynamic Domain Adaptation for Efficient Inference" (CVPR 2021) · ☆33 · Updated 3 years ago
- ☆20 · Updated 4 years ago
- TF-FD · ☆20 · Updated 2 years ago
- ☆23 · Updated 4 years ago
- PyTorch implementation of Temporal Output Discrepancy for Active Learning (ICCV 2021) · ☆41 · Updated 2 years ago
- Code for the CVPR 2021 paper "MOOD: Multi-level Out-of-distribution Detection" · ☆38 · Updated last year
- Code for Active Mixup (CVPR 2020) · ☆23 · Updated 3 years ago
- PyTorch implementation of "Distilling a Powerful Student Model via Online Knowledge Distillation" (IEEE TNNLS, 2022) · ☆28 · Updated 3 years ago
- ☆15 · Updated 4 years ago
- Distributed Network Architecture Search · ☆9 · Updated 5 years ago
- "Estimating Noise Transition Matrix with Label Correlations for Noisy Multi-Label Learning" (NeurIPS 2022) · ☆17 · Updated 2 years ago
- ☆10 · Updated 5 years ago
- A Generic Multi-classifier Paradigm for Incremental Learning · ☆11 · Updated 4 years ago
- [ECCV 2020] Learning from Extrinsic and Intrinsic Supervisions for Domain Generalization · ☆48 · Updated 3 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) · ☆29 · Updated 4 years ago
- Official implementation of the CVPR 2020 paper "Non-Local Neural Networks With Grouped Bilinear Attentional Transforms" · ☆12 · Updated 4 years ago
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (TNNLS), https://ieeexplore.ieee.org/abstract/… · ☆12 · Updated 2 years ago
- Triplet Loss for Knowledge Distillation · ☆18 · Updated 2 years ago
- [TMLR] "Adversarial Feature Augmentation and Normalization for Visual Recognition", Tianlong Chen, Yu Cheng, Zhe Gan, Jianfeng Wang, Liju… · ☆20 · Updated 2 years ago
- Official PyTorch code for the CVPR 2021 paper "AutoDO: Robust AutoAugment for Biased Data with Label Noise via Scalable Probabilistic Implici… · ☆24 · Updated 2 years ago
- PyTorch implementation of "Hallucinating Agnostic Images to Generalize Across Domains" · ☆11 · Updated 5 years ago
- ISD: Self-Supervised Learning by Iterative Similarity Distillation · ☆36 · Updated 3 years ago
- (ICML 2021) Implementation of S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning. Paper link: https://arxiv… · ☆43 · Updated 4 years ago
- AMTML-KD: Adaptive Multi-teacher Multi-level Knowledge Distillation · ☆59 · Updated 4 years ago
- Supervised and semi-supervised image classification with self-supervision (Keras) · ☆45 · Updated 4 years ago
- "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation" (IJCAI 2021) · ☆41 · Updated 2 years ago
- [NeurIPS 2021] "Improving Contrastive Learning on Imbalanced Data via Open-World Sampling", Ziyu Jiang, Tianlong Chen, Ting Chen, Zhangya… · ☆28 · Updated 3 years ago
- CVPR 2021 · ☆12 · Updated 4 years ago
- PyTorch implementation (TPAMI 2023) of "Training Compact CNNs for Image Classification using Dynamic-coded Filter Fusion" · ☆19 · Updated 2 years ago
- Code for "Balanced Knowledge Distillation for Long-tailed Learning" · ☆27 · Updated last year