alexrs / herd

Mixture-of-Experts (MoE) techniques for enhancing LLM performance through expert-driven prompt mapping and adapter combinations.
12 · Updated last year
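The description above mentions expert-driven prompt mapping combined with adapters. As an illustration only (this is not herd's actual API; the expert profiles, function names, and keyword-based scoring are all hypothetical assumptions), one common pattern is to score a prompt against each expert and softmax the scores into mixing weights for the adapters:

```python
import math

# Hypothetical sketch: each "expert" adapter is represented by a keyword
# profile; a prompt is mapped to softmax-weighted mixing coefficients over
# the experts. Illustrative only -- not herd's real routing logic.
EXPERTS = {
    "code": {"keywords": {"python", "function", "bug"}},
    "math": {"keywords": {"integral", "proof", "equation"}},
    "chat": {"keywords": {"hello", "story", "advice"}},
}

def route(prompt: str) -> dict:
    """Score each expert by keyword overlap, then softmax into weights."""
    tokens = set(prompt.lower().split())
    scores = {name: len(tokens & e["keywords"]) for name, e in EXPERTS.items()}
    exp_scores = {name: math.exp(s) for name, s in scores.items()}
    total = sum(exp_scores.values())
    return {name: v / total for name, v in exp_scores.items()}

weights = route("fix this python function bug")
# The "code" expert dominates the mixture for this prompt.
```

The resulting weights could then be used to form a single combined adapter as a weighted sum of adapter parameters, which is one way "adapter combinations" are typically realized.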

Alternatives and similar repositories for herd:

Users interested in herd are comparing it to the libraries listed below.