cadurosar / graph_kd
Graph Knowledge Distillation
☆13 · Updated 5 years ago
Alternatives and similar repositories for graph_kd
Users interested in graph_kd are comparing it to the repositories listed below.
- [AAAI 2020] Official implementation for "Online Knowledge Distillation with Diverse Peers" ☆74 · Updated last year
- ☆26 · Updated 4 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆29 · Updated 4 years ago
- Feature Fusion for Online Mutual Knowledge Distillation code ☆26 · Updated 4 years ago
- Implementation of the article "Adapting Auxiliary Losses Using Gradient Similarity" ☆32 · Updated 6 years ago
- Source code for "Distilling Knowledge From Graph Convolutional Networks" (CVPR 2020) ☆57 · Updated 2 years ago
- [NeurIPS 2019] Learning to Propagate for Graph Meta-Learning ☆36 · Updated 5 years ago
- Code for "Balanced Knowledge Distillation for Long-tailed Learning" ☆27 · Updated last year
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆16 · Updated 4 years ago
- Implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation" ☆32 · Updated 9 months ago
- [ICML 2020] SIGUA: Forgetting May Make Learning with Noisy Labels More Robust ☆15 · Updated 4 years ago
- [ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Che… ☆81 · Updated 3 years ago
- Source code for "Knowledge Distillation via Instance Relationship Graph" ☆30 · Updated 5 years ago
- Adjust Decision Boundary for Class Imbalanced Learning ☆19 · Updated 4 years ago
- [CVPR 2020] Rethinking Class-Balanced Methods for Long-Tailed Visual Recognition from a Domain Adaptation Perspective ☆24 · Updated 4 years ago
- Q. Yao, H. Yang, B. Han, G. Niu, J. Kwok. "Searching to Exploit Memorization Effect in Learning from Noisy Labels", ICML 2020 ☆22 · Updated 4 years ago
- [ICLR 2021] Heteroskedastic and Imbalanced Deep Learning with Adaptive Regularization ☆41 · Updated 4 years ago
- Code for the ICCV 2021 paper "Distilling Holistic Knowledge with Graph Neural Networks" ☆44 · Updated 3 years ago
- ☆61 · Updated 3 years ago
- ☆16 · Updated 4 years ago
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆24 · Updated 4 years ago
- PyTorch implementation of our paper accepted by IEEE TNNLS, 2022: "Distilling a Powerful Student Model via Online Knowledge Distillation" ☆28 · Updated 3 years ago
- Continual learning using variational prototype replays ☆9 · Updated 4 years ago
- ☆61 · Updated 5 years ago
- Code release for "Learning to Adapt to Evolving Domains" ☆31 · Updated 3 years ago
- A Generic Multi-classifier Paradigm for Incremental Learning ☆11 · Updated 4 years ago
- Reproducing VID from CVPR 2019 (work in progress) ☆20 · Updated 5 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆48 · Updated 2 years ago
- [NeurIPS 2021] "Improving Contrastive Learning on Imbalanced Data via Open-World Sampling", Ziyu Jiang, Tianlong Chen, Ting Chen, Zhangya… ☆28 · Updated 3 years ago
- [NeurIPS 2019] Deep Model Transferability from Attribution Maps ☆20 · Updated 5 years ago