facebookresearch / Mixture-of-Transformers

Mixture-of-Transformers: A Sparse and Scalable Architecture for Multi-Modal Foundation Models. TMLR 2025.
★ 61 · Updated 3 weeks ago

Alternatives and similar repositories for Mixture-of-Transformers

Users interested in Mixture-of-Transformers are comparing it to the libraries listed below.
