fkodom / soft-mixture-of-experts

PyTorch implementation of Soft MoE by Google DeepMind, from "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)
82 stars · Updated Oct 5, 2023
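Since the description only names the method, here is a minimal sketch of the Soft MoE routing scheme described in the paper: each of the n experts owns p "slots", every slot is a softmax-weighted (over tokens) mixture of the input, and every output token is a softmax-weighted (over slots) mixture of the expert outputs. The class name `SoftMoE` and the parameters `num_experts` / `slots_per_expert` are illustrative assumptions, not necessarily this repository's actual API.

```python
# Minimal Soft MoE sketch (arXiv:2308.00951). Names and shapes are
# illustrative assumptions, not this repo's actual interface.
import torch
import torch.nn as nn


class SoftMoE(nn.Module):
    def __init__(self, dim: int, num_experts: int, slots_per_expert: int = 1):
        super().__init__()
        self.num_experts = num_experts
        self.slots_per_expert = slots_per_expert
        # One routing parameter vector per slot (n experts * p slots each).
        self.phi = nn.Parameter(torch.randn(dim, num_experts * slots_per_expert))
        # Each expert is a small MLP here; the paper uses transformer FFN blocks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        logits = x @ self.phi                    # (b, m, n*p)
        dispatch = logits.softmax(dim=1)         # softmax over tokens, per slot
        combine = logits.softmax(dim=-1)         # softmax over slots, per token
        # Slots are soft mixtures of input tokens: X_tilde = D^T X.
        slots = torch.einsum("bmd,bms->bsd", x, dispatch)
        slots = slots.view(x.size(0), self.num_experts, self.slots_per_expert, -1)
        # Each expert processes only its own slots.
        outs = torch.stack(
            [expert(slots[:, i]) for i, expert in enumerate(self.experts)], dim=1
        )                                        # (b, n, p, dim)
        outs = outs.flatten(1, 2)                # (b, n*p, dim)
        # Each output token is a soft mixture of slot outputs: Y = C Y_tilde.
        return torch.einsum("bsd,bms->bmd", outs, combine)


# Usage: shapes in and out match, so the layer drops into a transformer block.
moe = SoftMoE(dim=128, num_experts=8, slots_per_expert=2)
y = moe(torch.randn(4, 196, 128))  # -> (4, 196, 128)
```

Note that the paper additionally l2-normalizes both the inputs and `phi` (with a learnable scale) for training stability; that detail is omitted here for brevity.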

Alternatives and similar repositories for soft-mixture-of-experts

Users interested in soft-mixture-of-experts are comparing it to the libraries listed below.

