facebookresearch / Mixture-of-Transformers
Mixture-of-Transformers: A Sparse and Scalable Architecture for Multi-Modal Foundation Models. TMLR 2025.
181 stars · Sep 13, 2025 · Updated 6 months ago
