facebookresearch / Mixture-of-Transformers

Mixture-of-Transformers: A Sparse and Scalable Architecture for Multi-Modal Foundation Models. TMLR 2025. πŸ”— https://arxiv.org/abs/2411.04996
β˜† 31 Β· Updated this week

Alternatives and similar repositories for Mixture-of-Transformers

Users who are interested in Mixture-of-Transformers are comparing it to the libraries listed below.
