Boyiliee / MoEx
MoEx (Moment Exchange)
☆141 · Updated 3 years ago
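As context for the listing below, here is a minimal sketch of the Moment Exchange idea from the MoEx paper ("On Feature Normalization and Data Augmentation"), not the repository's exact code: per-position feature moments (PONO-style mean and std over channels) from one sample are injected into the normalized features of another, and the loss is interpolated between the two label sets. Function and variable names are illustrative.

```python
import torch

def moex(features_a, features_b, eps=1e-5):
    """Moment exchange sketch (PONO-style moments).

    Normalizes features_a with its own per-position moments, then
    re-injects the moments computed from features_b.
    Shapes: (N, C, H, W); moments are taken over the channel dim.
    """
    mean_a = features_a.mean(dim=1, keepdim=True)
    std_a = features_a.var(dim=1, keepdim=True).add(eps).sqrt()
    mean_b = features_b.mean(dim=1, keepdim=True)
    std_b = features_b.var(dim=1, keepdim=True).add(eps).sqrt()
    normalized = (features_a - mean_a) / std_a
    return normalized * std_b + mean_b
```

In training, `features_b` is typically a permuted view of the same batch (`features[perm]`), and the classification loss is mixed accordingly, e.g. `lam * ce(out, y) + (1 - lam) * ce(out, y[perm])` with a fixed interpolation weight `lam`.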
Alternatives and similar repositories for MoEx:
Users interested in MoEx are comparing it to the libraries listed below.
- Un-Mix: Rethinking Image Mixtures for Unsupervised Visual Representation Learning ☆151 · Updated 2 years ago
- PyTorch implementation of "Open Compound Domain Adaptation" (CVPR 2020 oral) ☆139 · Updated 3 years ago
- Code for the paper "Dual Student: Breaking the Limits of the Teacher in Semi-Supervised Learning" [ICCV 2019] ☆118 · Updated 4 years ago
- Auto-Encoding Transformations (AETv1), CVPR 2019 ☆108 · Updated 5 years ago
- Implementation of Momentum^2 Teacher ☆121 · Updated 4 years ago
- [ICCV 2019 oral] Code for Semi-Supervised Learning by Augmented Distribution Alignment ☆62 · Updated 3 years ago
- (NeurIPS 2020) Adversarial Style Mining for One-Shot Unsupervised Domain Adaptation ☆71 · Updated 3 years ago
- A ShuffleBatchNorm layer to shuffle BatchNorm statistics across multiple GPUs ☆56 · Updated 3 years ago
- Self-supervised Label Augmentation via Input Transformations (ICML 2020) ☆105 · Updated 4 years ago
- EnAET: Self-Trained Ensemble AutoEncoding Transformations for Semi-Supervised Learning ☆81 · Updated last year
- Code for "Domain Adaptation for Semantic Segmentation with Maximum Squares Loss" in PyTorch.