fkodom / soft-mixture-of-experts

PyTorch implementation of Soft MoE, introduced by Google Brain in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)
☆ 78 · Updated 2 years ago
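For context on what the repository implements: in Soft MoE, every input token is softly dispatched to a fixed number of expert "slots" via softmax weights, each expert processes its slots, and the results are softly combined back per token. A minimal NumPy sketch of that dispatch/combine flow is below; the function and variable names (`soft_moe`, `phi`, `experts`) are illustrative assumptions, not the repository's API.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def soft_moe(x, phi, experts):
    """Soft MoE layer sketch.

    x:       (n, d) token embeddings
    phi:     (d, e*s) learnable slot parameters (e experts, s slots each)
    experts: list of e callables, each mapping (s, d) -> (s, d)
    """
    logits = x @ phi                       # (n, e*s) token-slot affinities
    dispatch = softmax(logits, axis=0)     # normalize over tokens per slot
    slots = dispatch.T @ x                 # (e*s, d) weighted token mixtures
    s = slots.shape[0] // len(experts)
    outs = np.concatenate(                 # each expert processes its own slots
        [f(slots[i * s:(i + 1) * s]) for i, f in enumerate(experts)]
    )
    combine = softmax(logits, axis=1)      # normalize over slots per token
    return combine @ outs                  # (n, d) per-token output
```

With two toy "experts" (e.g. `lambda z: z * 2.0` and `lambda z: z + 1.0`) and `phi` of shape `(d, 6)`, the output keeps the input's `(n, d)` shape, since every token receives a convex combination of all slot outputs.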

Alternatives and similar repositories for soft-mixture-of-experts

Users interested in soft-mixture-of-experts are comparing it to the libraries listed below.
