rishikksh20 / ResMLP-pytorch
ResMLP: Feedforward networks for image classification with data-efficient training
☆42 · Updated 3 years ago
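For orientation, below is a minimal PyTorch sketch of the ResMLP block described in the paper: Affine normalization, a cross-patch linear layer, and a per-patch channel MLP, each with a residual connection and LayerScale-style scaling. Class names and hyperparameters are illustrative and are not taken from this repository's code.

```python
import torch
import torch.nn as nn

class Affine(nn.Module):
    # Learnable per-channel scale and shift (ResMLP's replacement for LayerNorm).
    def __init__(self, dim):
        super().__init__()
        self.alpha = nn.Parameter(torch.ones(dim))
        self.beta = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        return self.alpha * x + self.beta

class ResMLPBlock(nn.Module):
    # One ResMLP layer: cross-patch linear mixing, then a per-patch channel MLP,
    # each preceded by Affine normalization and scaled by a learnable LayerScale vector.
    def __init__(self, dim, num_patches, expansion=4, init_scale=0.1):
        super().__init__()
        self.norm1 = Affine(dim)
        self.token_mix = nn.Linear(num_patches, num_patches)  # mixes information across patches
        self.scale1 = nn.Parameter(init_scale * torch.ones(dim))

        self.norm2 = Affine(dim)
        self.channel_mlp = nn.Sequential(                      # standard MLP applied to each patch
            nn.Linear(dim, expansion * dim),
            nn.GELU(),
            nn.Linear(expansion * dim, dim),
        )
        self.scale2 = nn.Parameter(init_scale * torch.ones(dim))

    def forward(self, x):  # x: (batch, num_patches, dim)
        x = x + self.scale1 * self.token_mix(self.norm1(x).transpose(1, 2)).transpose(1, 2)
        x = x + self.scale2 * self.channel_mlp(self.norm2(x))
        return x

# Quick shape check
block = ResMLPBlock(dim=384, num_patches=196)
print(block(torch.randn(2, 196, 384)).shape)  # torch.Size([2, 196, 384])
```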
Alternatives and similar repositories for ResMLP-pytorch:
Users interested in ResMLP-pytorch are comparing it to the libraries listed below.
- This is the implementation of our CVPR'23 paper "Class-Conditional Sharpness-Aware Minimization for Deep Long-Tailed Recognition". ☆17 · Updated last year
- Official implementation for "Pure Noise to the Rescue of Insufficient Data: Improving Imbalanced Classification by Training on Random Noise Images" ☆15 · Updated 2 years ago
- ☆58 · Updated 2 years ago
- Implementation of HAT (https://arxiv.org/pdf/2204.00993) ☆48 · Updated 10 months ago
- ☆27 · Updated 2 years ago
- LoMaR (Efficient Self-supervised Vision Pretraining with Local Masked Reconstruction) ☆62 · Updated 2 years ago
- [CVPR 2023 Highlight] Masked Image Modeling with Local Multi-Scale Reconstruction ☆46 · Updated last year
- Implementation of "Hire-MLP: Vision MLP via Hierarchical Rearrangement" and "An Image Patch is a Wave: Phase-Aware Vision MLP". ☆34 · Updated 2 years ago
- ☆57 · Updated last year
- [CVPR 2022] "The Principle of Diversity: Training Stronger Vision Transformers Calls for Reducing All Levels of Redundancy" by Tianlong Chen et al. ☆25 · Updated 2 years ago
- ☆25 · Updated last year
- Official repository for the paper "Salient Mask-Guided Vision Transformer for Fine-Grained Classification" (VISIGRAPP '23) ☆18 · Updated last year
- ☆41 · Updated 2 years ago
- [CVPR 2022] Official implementation of the paper "Learning Where to Learn in Cross-View Self-Supervised Learning" ☆27 · Updated 2 years ago
- [ECCV 2022] Implementation of the paper "Locality Guidance for Improving Vision Transformers on Tiny Datasets" ☆77 · Updated 2 years ago
- Prior Knowledge Guided Unsupervised Domain Adaptation (ECCV 2022) ☆16 · Updated 2 years ago
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) ☆27 · Updated last year
- [CVPR 2023] This repository includes the official implementation of our paper "Masked Autoencoders Enable Efficient Knowledge Distillers"