facebookresearch / Mixture-of-Transformers

Mixture-of-Transformers: A Sparse and Scalable Architecture for Multi-Modal Foundation Models. TMLR 2025.
162 · Sep 13, 2025 · Updated 5 months ago
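The paper's core idea is to decouple the non-embedding transformer parameters (feed-forward layers, attention projections, layer norms) by modality while keeping global self-attention over the full mixed-modal token sequence. Below is a minimal, hypothetical PyTorch sketch of that idea, not the repository's actual implementation: for brevity it unties only the feed-forward weights per modality, and all module names and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MoTBlock(nn.Module):
    """Sketch of a Mixture-of-Transformers-style block: self-attention
    is computed over all tokens jointly, while feed-forward weights are
    untied per modality. Illustrative only."""

    def __init__(self, d_model: int, n_heads: int, n_modalities: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # One FFN per modality (e.g., 0 = text, 1 = image).
        self.ffns = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_modalities)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor, modality: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); modality: (batch, seq) integer labels.
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h)  # attention sees all modalities
        x = x + attn_out
        h = self.norm2(x)
        out = torch.zeros_like(h)
        for m, ffn in enumerate(self.ffns):  # route tokens by modality label
            mask = modality == m
            if mask.any():
                out[mask] = ffn(h[mask])
        return x + out

# Usage: one sequence mixing text (0) and image (1) tokens.
block = MoTBlock(d_model=64, n_heads=4, n_modalities=2)
tokens = torch.randn(2, 10, 64)
labels = torch.randint(0, 2, (2, 10))
print(block(tokens, labels).shape)  # torch.Size([2, 10, 64])
```

Because tokens are routed by a fixed modality label rather than a learned gate, each token activates only its modality's parameters, which is what makes the architecture sparse without requiring router training.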
