kyegomez / LIMoE

Implementation of LIMoE, "the first large-scale multimodal mixture of experts models," from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts".
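The core idea behind a mixture-of-experts layer is a learned gate that routes each token to a small subset of expert networks, so capacity grows without every token paying for every expert. A rough toy sketch of top-k routing is below; this is a generic NumPy illustration, not code from this repository, and all names (`moe_layer`, `gate_w`, `expert_ws`) are hypothetical:

```python
import numpy as np

def moe_layer(x, gate_w, expert_ws, k=1):
    """Sparse MoE layer: route each token to its top-k experts.

    x: (tokens, d_model); gate_w: (d_model, n_experts);
    expert_ws: list of (d_model, d_model) toy linear experts.
    """
    logits = x @ gate_w                                  # (tokens, n_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)           # softmax over experts
    topk = np.argsort(-probs, axis=-1)[:, :k]            # top-k expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for e in topk[t]:
            # weight each selected expert's output by its gate probability
            out[t] += probs[t, e] * (x[t] @ expert_ws[e])
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 4, 3
x = rng.standard_normal((tokens, d))
gate_w = rng.standard_normal((d, n_experts))
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
y = moe_layer(x, gate_w, experts, k=1)
print(y.shape)  # → (3, 8)
```

In LIMoE the experts are shared across image and text tokens inside a single transformer, with auxiliary losses to keep the routing balanced; the sketch above only shows the basic sparse-routing mechanism.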
