facebookresearch / maws
Code and models for the paper "The effectiveness of MAE pre-pretraining for billion-scale pretraining" (https://arxiv.org/abs/2303.13496)
☆92 · Updated 5 months ago
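For context, the maws repository distributes its pre-trained backbones in PyTorch. The sketch below shows one way such a model could be loaded through `torch.hub` and run on a dummy image batch; the entry-point name `vit_b16_maws` and the 224×224 input resolution are assumptions (typical for a ViT-B/16), so check the repository's `hubconf.py` for the exact model names it exposes.

```python
import torch

# Minimal sketch: loading a MAWS pre-trained ViT via torch.hub.
# The entry-point name "vit_b16_maws" is an assumption -- consult the
# repository's hubconf.py for the actual list of available models.
model = torch.hub.load("facebookresearch/maws", model="vit_b16_maws")
model.eval()

# Dummy image batch at 224x224 (assumed ViT-B/16 input resolution).
images = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    features = model(images)

print(features.shape)  # feature embeddings for the batch
```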
Alternatives and similar repositories for maws
Users interested in maws are comparing it to the libraries listed below.
- Code for experiments for "ConvNet vs Transformer, Supervised vs CLIP: Beyond ImageNet Accuracy" ☆101 · Updated last year
- [ICML 2024] This repository includes the official implementation of our paper "Rejuvenating image-GPT as Strong Visual Representation Lea…" ☆98 · Updated last year
- Test-Time Training on Video Streams ☆64 · Updated 2 years ago
- PyTorch implementation of R-MAE (https://arxiv.org/abs/2306.05411) ☆114 · Updated 2 years ago
- [NeurIPS 2024] Official implementation of the paper "Interfacing Foundation Models' Embeddings" ☆125 · Updated last year
- [WACV 2025 Oral] DeepMIM: Deep Supervision for Masked Image Modeling ☆53 · Updated 4 months ago
- Codebase of SynthCLIP: CLIP training with purely synthetic text-image pairs from LLMs and TTIs. ☆100 · Updated 6 months ago
- Code release of the research paper "Exploring Long-Sequence Masked Autoencoders" ☆100 · Updated 2 years ago
- Official repository of the paper "Subobject-level Image Tokenization" (ICML 2025) ☆87 · Updated 3 months ago
- ☆53 · Updated 2 years ago
- This is an official PyTorch/GPU implementation of SupMAE. ☆78 · Updated 3 years ago
- A Contrastive Learning Boost from Intermediate Pre-Trained Representations ☆43 · Updated last year
- [CVPR24] Official Implementation of GEM (Grounding Everything Module)