kyegomez / LIMoE

Implementation of "the first large-scale multimodal mixture of experts models," from the paper "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts".
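LIMoE's core idea is a sparse mixture-of-experts layer: a learned gate routes each image or text token to one of several expert networks, and only the chosen expert runs for that token. The sketch below is a generic top-1 token-routing MoE layer in NumPy, not this repository's implementation; all names (`MoELayer`, `d_model`, `n_experts`) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    """Illustrative sparse MoE layer with top-1 token routing."""

    def __init__(self, d_model, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        # gate: projects each token to a score per expert
        self.gate = rng.standard_normal((d_model, n_experts)) * 0.02
        # each expert: a simple linear map d_model -> d_model
        self.experts = rng.standard_normal((n_experts, d_model, d_model)) * 0.02

    def __call__(self, tokens):
        # tokens: (n_tokens, d_model)
        logits = tokens @ self.gate            # (n_tokens, n_experts)
        probs = softmax(logits, axis=-1)
        choice = probs.argmax(axis=-1)         # top-1 expert per token
        out = np.zeros_like(tokens)
        for e in range(self.experts.shape[0]):
            mask = choice == e
            if mask.any():
                # run only the selected expert, scaled by its gate probability
                out[mask] = (tokens[mask] @ self.experts[e]) * probs[mask, e:e + 1]
        return out, choice

layer = MoELayer(d_model=8, n_experts=4)
x = np.random.default_rng(1).standard_normal((16, 8))
y, routing = layer(x)   # y: (16, 8); routing: expert index per token
```

In the paper, this kind of layer replaces the dense feed-forward block in some transformer layers, with auxiliary losses to balance how often each expert is used; those losses are omitted here for brevity.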
