facebookresearch / maws

Code and models for the paper "The effectiveness of MAE pre-pretraining for billion-scale pretraining" (https://arxiv.org/abs/2303.13496).
