fkodom / soft-mixture-of-experts

PyTorch implementation of Soft MoE, introduced by Google Brain in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)
☆ 73 · Updated last year
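The listing above names Soft MoE but the page carries no explanation of it. As a rough, dependency-free illustration of the algorithm described in the paper (not this repository's actual code), here is a sketch: tokens are softly dispatched to a fixed number of slots via per-slot softmax weights, each slot is processed by an expert, and outputs are recombined with per-token softmax weights. The function name `soft_moe`, the pure-Python matrix handling, and the identity experts are all illustrative assumptions.

```python
import math

def softmax(v):
    """Numerically stable softmax over a list of floats."""
    m = max(v)
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

def soft_moe(X, Phi, experts):
    """Soft MoE sketch (per the paper, not this repo's API).

    X:       n tokens, each a d-dim list.
    Phi:     d x m slot parameters (m = total slots).
    experts: list of e callables mapping a d-vector to a d-vector;
             m must be divisible by e (each expert owns m/e slots).
    """
    n, d, m = len(X), len(X[0]), len(Phi[0])
    # Token-slot affinity logits: X @ Phi, shape n x m.
    logits = [[sum(X[i][k] * Phi[k][j] for k in range(d)) for j in range(m)]
              for i in range(n)]
    # Dispatch weights: softmax over tokens (per slot column).
    cols = [softmax([logits[i][j] for i in range(n)]) for j in range(m)]
    # Each slot is a convex combination of the input tokens.
    slots = [[sum(cols[j][i] * X[i][k] for i in range(n)) for k in range(d)]
             for j in range(m)]
    per = m // len(experts)
    slot_out = [experts[j // per](slots[j]) for j in range(m)]
    # Combine weights: softmax over slots (per token row).
    C = [softmax(row) for row in logits]
    return [[sum(C[i][j] * slot_out[j][k] for j in range(m)) for k in range(d)]
            for i in range(n)]

# Tiny usage example with a single identity "expert" (illustrative only).
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
Phi = [[1.0, 0.0], [0.0, 1.0]]
Y = soft_moe(X, Phi, [lambda v: list(v)])
```

Because dispatch and combine are both softmax-weighted averages, every token receives a contribution from every expert, which is what makes Soft MoE fully differentiable, unlike sparse top-k routing.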

Alternatives and similar repositories for soft-mixture-of-experts

Users interested in soft-mixture-of-experts are comparing it to the libraries listed below.
