maidacundo / MoE-LoRA

Adapt an LLM into a Mixture-of-Experts model using parameter-efficient fine-tuning (LoRA), injecting the LoRA adapters into the FFN layers.
84 stars · Updated Oct 21, 2025
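The repository's code isn't shown here, but as a rough illustration of the idea in the description, below is a minimal PyTorch sketch of a frozen FFN linear layer augmented with several LoRA experts and a per-token top-k router. All names (`MoELoRALinear`, `num_experts`, `top_k`, and so on) and the routing scheme are assumptions for illustration, not this repo's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELoRALinear(nn.Module):
    """Hypothetical sketch: a frozen linear layer plus a mixture of
    LoRA experts, where each expert is a low-rank (A, B) pair and a
    learned router mixes the top-k expert deltas per token."""

    def __init__(self, base: nn.Linear, num_experts: int = 4,
                 r: int = 8, alpha: int = 16, top_k: int = 2):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze pretrained weights
            p.requires_grad_(False)
        self.scaling = alpha / r
        self.top_k = top_k
        in_f, out_f = base.in_features, base.out_features
        # One low-rank adapter per expert: W x + scaling * B_e A_e x
        self.lora_A = nn.Parameter(torch.randn(num_experts, r, in_f) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(num_experts, out_f, r))
        self.router = nn.Linear(in_f, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., in_features); route each token to its top-k experts
        logits = self.router(x)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = self.base(x)
        # Compute every expert's low-rank delta, then gather the
        # selected ones (dense for clarity; real MoE kernels dispatch
        # tokens sparsely to the chosen experts only).
        delta = torch.einsum("...i,eri->...er", x, self.lora_A)
        delta = torch.einsum("...er,eor->...eo", delta, self.lora_B)
        picked = delta.gather(
            -2, idx.unsqueeze(-1).expand(*idx.shape, delta.size(-1)))
        return out + self.scaling * (weights.unsqueeze(-1) * picked).sum(-2)

# Usage: wrap an FFN projection of a pretrained block (sizes illustrative)
ffn = nn.Linear(768, 3072)
layer = MoELoRALinear(ffn, num_experts=4, r=8, top_k=2)
y = layer(torch.randn(2, 16, 768))  # (batch, seq, hidden) -> (2, 16, 3072)
```

Under these assumptions, only the router and the LoRA matrices train, so each wrapped layer adds roughly `num_experts × r × (in_features + out_features)` trainable parameters rather than full FFN copies per expert.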

Alternatives and similar repositories for MoE-LoRA

Users interested in MoE-LoRA are comparing it to the libraries listed below.
