fkodom / soft-mixture-of-experts

PyTorch implementation of Soft MoE, proposed by Google DeepMind in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)
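The core idea of Soft MoE is that every input token is softly assigned to every expert "slot": dispatch weights (a softmax over tokens, per slot) mix tokens into slots, each expert processes its slots, and combine weights (a softmax over slots, per token) mix expert outputs back into tokens. A minimal NumPy sketch of this routing, under my reading of the paper's Algorithm 1 (not this repository's actual PyTorch code; `phi` plays the role of the learnable slot-embedding matrix):

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_moe(x, phi, experts, slots_per_expert):
    """x: (m, d) tokens; phi: (d, n * p) slot parameters,
    where n = len(experts) and p = slots_per_expert."""
    logits = x @ phi                    # (m, n*p) token-slot affinities
    dispatch = softmax(logits, axis=0)  # per slot: softmax over tokens
    combine = softmax(logits, axis=1)   # per token: softmax over slots
    slots = dispatch.T @ x              # (n*p, d) soft mixes of tokens
    p = slots_per_expert
    # Each expert processes its own contiguous group of p slots.
    outs = np.concatenate(
        [f(slots[i * p:(i + 1) * p]) for i, f in enumerate(experts)], axis=0
    )                                   # (n*p, d)
    return combine @ outs               # (m, d) back to one output per token
```

Because both softmaxes produce convex weights, with identity experts each output token is a convex combination of the input tokens; the repository's implementation additionally makes `phi` a learnable parameter and batches this over sequences.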
71 stars · Updated last year

Alternatives and similar repositories for soft-mixture-of-experts: