sseung0703 / Variational_Information_Distillation
Reproducing VID (CVPR 2019), work in progress
☆20 · Updated 5 years ago
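For context, VID ("Variational Information Distillation for Knowledge Transfer", Ahn et al., CVPR 2019) trains the student to maximize a variational lower bound on the mutual information between teacher and student feature maps; in practice this reduces to a Gaussian negative log-likelihood in which a small regressor predicts the teacher features from the student features and a per-channel variance is learned. The snippet below is a minimal PyTorch sketch of that loss under those assumptions; the `VIDLoss` name, the 1x1-conv regressor, and the `init_alpha` default are illustrative choices, not this repository's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class VIDLoss(nn.Module):
    """Hypothetical sketch of the VID feature-distillation loss.

    Models the teacher feature map with a Gaussian whose mean is regressed
    from the student feature map and whose per-channel variance is learned,
    then minimizes the negative log-likelihood (a variational lower bound
    on the teacher-student mutual information, up to additive constants).
    """

    def __init__(self, student_channels, teacher_channels, init_alpha=5.0):
        super().__init__()
        # 1x1 convolution maps student channels to teacher channels.
        self.regressor = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)
        # Unconstrained parameter; softplus keeps the variance positive.
        self.alpha = nn.Parameter(torch.full((teacher_channels,), init_alpha))

    def forward(self, student_feat, teacher_feat):
        # Assumes both feature maps share the same spatial size (N, C, H, W).
        mu = self.regressor(student_feat)
        var = F.softplus(self.alpha).view(1, -1, 1, 1) + 1e-6
        # Gaussian NLL per element, dropping additive constants.
        nll = 0.5 * (torch.log(var) + (teacher_feat - mu) ** 2 / var)
        return nll.mean()
```

In the paper this term is computed for several teacher/student layer pairs and added to the ordinary task loss with a weighting coefficient; the equal-spatial-size assumption above would need an extra resize or pooling step when the paired layers differ in resolution.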
Alternatives and similar repositories for Variational_Information_Distillation
Users interested in Variational_Information_Distillation are comparing it to the repositories listed below.
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆108 · Updated 5 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020) ☆29 · Updated 4 years ago
- [CVPR 2021] Adaptive Consistency Regularization for Semi-Supervised Transfer Learning ☆105 · Updated 3 years ago
- Pytorch Implementation of Domain Generalization Using a Mixture of Multiple Latent Domains ☆103 · Updated 3 years ago
- PyTorch implementations of "Unsupervised Semantic Aggregation and Deformable Template Matching for Semi-Supervised Learning" (NeurIPS2020… ☆31 · Updated 4 years ago
- [AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers". ☆74 · Updated 2 years ago
- Feature Fusion for Online Mutual Knowledge Distillation Code ☆26 · Updated 4 years ago
- The implementation of AAAI 2021 Paper: "Progressive Network Grafting for Few-Shot Knowledge Distillation". ☆32 · Updated 11 months ago
- ☆61 · Updated 5 years ago
- ☆127 · Updated 4 years ago
- A pytorch implementation of paper 'Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation', … ☆179 · Updated 3 years ago
- Improving Calibration for Long-Tailed Recognition (CVPR2021) ☆148 · Updated 3 years ago
- ☆34 · Updated last year
- When Does Label Smoothing Help?_pytorch_implementation ☆125 · Updated 5 years ago
- Code release for NeurIPS 2020 paper "Co-Tuning for Transfer Learning" ☆40 · Updated 3 years ago
- Official implementation for (Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching, AAAI-2021) ☆117 · Updated 4 years ago
- SKD : Self-supervised Knowledge Distillation for Few-shot Learning ☆98 · Updated last year
- [ECCV2020] Knowledge Distillation Meets Self-Supervision ☆237 · Updated 2 years ago
- [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration". ☆75 · Updated 11 months ago
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆82 · Updated 4 years ago
- A collection of awesome things about mixed sample data augmentation ☆132 · Updated 5 years ago
- Code for ICCV2019 "Symmetric Cross Entropy for Robust Learning with Noisy Labels" ☆170 · Updated 4 years ago
- The official code for the paper "Delving Deep into Label Smoothing", IEEE TIP 2021 ☆81 · Updated 3 years ago
- Learning with Noisy Labels via Sparse Regularization, ICCV2021 ☆46 · Updated 3 years ago
- NeurIPS 2021, "Fine Samples for Learning with Noisy Labels" ☆39 · Updated 3 years ago
- Published in IEEE Transactions on Artificial Intelligence ☆56 · Updated 3 years ago
- CrossNorm and SelfNorm for Generalization under Distribution Shifts, ICCV 2021 ☆129 · Updated 3 years ago
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆105 · Updated 4 years ago
- Official Pytorch implementation of MixMo framework ☆84 · Updated 3 years ago
- A simple reimplementation of Online Knowledge Distillation via Collaborative Learning in PyTorch ☆49 · Updated 2 years ago