facebookresearch / maws

Code and models for the paper "The effectiveness of MAE pre-pretraining for billion-scale pretraining" https://arxiv.org/abs/2303.13496
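Since the repository ships pretrained models, here is a minimal usage sketch loading one via torch.hub. The entry-point name `vit_b16_maws` is an assumption; check the maws README for the exact model identifiers it exposes.

```python
# Hypothetical sketch: load a MAWS pretrained backbone through torch.hub.
# The model name "vit_b16_maws" is assumed -- see the maws repository README
# for the actual entry points defined in its hubconf.
import torch

model = torch.hub.load("facebookresearch/maws", model="vit_b16_maws")
model.eval()

# Forward a dummy ImageNet-sized batch to obtain features.
with torch.no_grad():
    features = model(torch.randn(1, 3, 224, 224))
print(features.shape)
```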
☆ 89 · Updated last month

Alternatives and similar repositories for maws

Users interested in maws are comparing it to the libraries listed below.
