facebookresearch/maws

Code and models for the paper "The effectiveness of MAE pre-pretraining for billion-scale pretraining" https://arxiv.org/abs/2303.13496
