sseung0703 / Variational_Information_Distillation
Reproducing VID (CVPR 2019); work in progress
☆ 20 · Updated 5 years ago
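For context, Variational Information Distillation (Ahn et al., CVPR 2019) trains the student to maximize a variational lower bound on the mutual information between teacher and student features, which reduces to a heteroscedastic Gaussian negative log-likelihood with a learned per-channel variance. A minimal NumPy sketch of that loss, assuming illustrative names (`vid_loss`, `alpha`) that are not taken from this repository:

```python
import numpy as np

def softplus(x):
    """Numerically stable softplus, used to keep the variance positive."""
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def vid_loss(teacher_feat, student_mu, alpha, eps=1e-6):
    """VID-style loss: negative Gaussian log-likelihood of teacher features.

    teacher_feat: teacher activations, shape (batch, channels)
    student_mu:   student regressor output predicting the teacher, same shape
    alpha:        learnable per-channel variance parameter, shape (channels,)
    """
    sigma2 = softplus(alpha) + eps  # per-channel variance, always > 0
    # log-variance term plus scaled squared error, averaged over batch and channels
    return np.mean(0.5 * np.log(sigma2) + (teacher_feat - student_mu) ** 2 / (2.0 * sigma2))
```

In the paper, `student_mu` comes from a small regressor applied to intermediate student features, and `alpha` is optimized jointly with the student, so channels the student cannot predict are down-weighted through a larger variance.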
Alternatives and similar repositories for Variational_Information_Distillation
Users interested in Variational_Information_Distillation are comparing it to the repositories listed below.
- Regularizing Class-wise Predictions via Self-knowledge Distillation (CVPR 2020) ☆ 107 · Updated 4 years ago
- ☆ 61 · Updated 5 years ago
- PyTorch implementations of "Unsupervised Semantic Aggregation and Deformable Template Matching for Semi-Supervised Learning" (NeurIPS 2020… ☆ 31 · Updated 4 years ago
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) ☆ 29 · Updated 4 years ago
- Implementation of the Heterogeneous Knowledge Distillation using Information Flow Modeling method ☆ 24 · Updated 4 years ago
- [AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers". ☆ 74 · Updated last year
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification ☆ 82 · Updated 3 years ago
- Implementation of the AAAI 2021 paper "Progressive Network Grafting for Few-Shot Knowledge Distillation" ☆ 32 · Updated 9 months ago
- Code for Feature Fusion for Online Mutual Knowledge Distillation ☆ 26 · Updated 4 years ago
- [CVPR 2021] Adaptive Consistency Regularization for Semi-Supervised Transfer Learning ☆ 104 · Updated 3 years ago
- PyTorch implementation of Self-supervised Contrastive Regularization for DG (SelfReg) [ICCV 2021] ☆ 77 · Updated 3 years ago
- (NeurIPS 2020 Workshop on SSL) Official implementation of "MixCo: Mix-up Contrastive Learning for Visual Representation" ☆ 58 · Updated 2 years ago
- ☆ 15 · Updated 3 years ago
- ☆ 26 · Updated 4 years ago
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆ 106 · Updated 4 years ago
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) ☆ 71 · Updated 5 years ago
- ☆ 50 · Updated 5 years ago
- NeurIPS 2021, "FINE Samples for Learning with Noisy Labels" ☆ 39 · Updated 3 years ago
- A simple PyTorch reimplementation of Online Knowledge Distillation via Collaborative Learning ☆ 48 · Updated 2 years ago
- SKD: Self-supervised Knowledge Distillation for Few-shot Learning ☆ 97 · Updated last year
- PyTorch implementation of Domain Generalization Using a Mixture of Multiple Latent Domains ☆ 102 · Updated 3 years ago
- Repository for the paper "Episodic Training for Domain Generalization" (https://arxiv.org/abs/1902.00113) ☆ 57 · Updated last year
- [ECCV 2020] Learning from Extrinsic and Intrinsic Supervisions for Domain Generalization ☆ 48 · Updated 2 years ago
- Reproduction of "DLOW: Domain Flow for Adaptation and Generalization" ☆ 20 · Updated 5 years ago
- Code for "Self-Distillation as Instance-Specific Label Smoothing" ☆ 16 · Updated 4 years ago
- Code release for "Mind the Gap: Enlarging the Domain Gap in Open Set Domain Adaptation" (TCSVT 2023) ☆ 22 · Updated last year
- (Unofficial) implementation of Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019) ☆ 14 · Updated 4 years ago
- Code for the paper "M2m: Imbalanced Classification via Major-to-minor Translation" (CVPR 2020) ☆ 96 · Updated 3 years ago
- Code release for "Transferable Normalization: Towards Improving Transferability of Deep Neural Networks" (NeurIPS 2019) ☆ 79 · Updated 4 years ago
- [CVPR 2020] Rethinking Class-Balanced Methods for Long-Tailed Visual Recognition from a Domain Adaptation Perspective ☆ 24 · Updated 4 years ago