alexrs / herd

Mixture-of-Experts (MoE) techniques for improving LLM performance via expert-driven prompt mapping and adapter combination.
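To illustrate the idea in the description, here is a minimal sketch of routing prompts to a set of low-rank adapters and mixing their outputs; all class and parameter names (`LoRAAdapter`, `AdapterHerd`, the router layer) are hypothetical and not taken from the repo's actual code.

```python
# Illustrative sketch only -- not the herd repo's implementation.
import torch
import torch.nn as nn

class LoRAAdapter(nn.Module):
    """A single low-rank adapter: x -> (x @ A @ B) * scale."""
    def __init__(self, dim: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.A = nn.Parameter(torch.randn(dim, rank) * 0.01)
        self.B = nn.Parameter(torch.zeros(rank, dim))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (x @ self.A @ self.B) * self.scale

class AdapterHerd(nn.Module):
    """Map a prompt to mixing weights over expert adapters, then
    combine the adapters' outputs as a residual update."""
    def __init__(self, dim: int, num_experts: int = 4):
        super().__init__()
        self.experts = nn.ModuleList(LoRAAdapter(dim) for _ in range(num_experts))
        self.router = nn.Linear(dim, num_experts)  # prompt features -> expert weights

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Route on the mean prompt representation; softmax gives mixing weights.
        weights = torch.softmax(self.router(x.mean(dim=1)), dim=-1)   # (batch, experts)
        expert_out = torch.stack([e(x) for e in self.experts], dim=-1)  # (b, seq, dim, experts)
        return x + torch.einsum("bsde,be->bsd", expert_out, weights)

# Usage: mix 4 adapters over a batch of prompt hidden states.
hidden = torch.randn(2, 16, 64)  # (batch, seq_len, dim)
print(AdapterHerd(dim=64)(hidden).shape)  # torch.Size([2, 16, 64])
```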

Related projects: