Soft Mixture of Experts Vision Transformer, addressing the MoE limitations highlighted by Puigcerver et al. (2023).
☆16 · Aug 13, 2023 · Updated 2 years ago
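Soft MoE replaces hard top-k token-to-expert assignment with fully differentiable routing: each expert processes a fixed number of "slots", every slot is a softmax-weighted mix of all input tokens, and slot outputs are mixed back per token. The sketch below illustrates that routing in PyTorch; the class and argument names (`SoftMoELayer`, `num_experts`, `slots_per_expert`) are illustrative assumptions, not this repository's actual API.

```python
import torch
import torch.nn as nn


class SoftMoELayer(nn.Module):
    """Minimal Soft MoE routing sketch (hypothetical name, not the repo's API)."""

    def __init__(self, dim: int, num_experts: int = 4, slots_per_expert: int = 1):
        super().__init__()
        self.num_experts = num_experts
        # One learnable embedding per slot; token-slot dot products give routing logits.
        self.slot_embeds = nn.Parameter(
            torch.randn(num_experts * slots_per_expert, dim)
        )
        # Each expert is a small MLP here, standing in for a ViT MLP block.
        self.experts = nn.ModuleList(
            [
                nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_tokens, dim)
        logits = torch.einsum("bnd,md->bnm", x, self.slot_embeds)
        dispatch = logits.softmax(dim=1)  # over tokens: each slot is a convex mix of tokens
        combine = logits.softmax(dim=2)   # over slots: each token mixes all slot outputs
        slots = torch.einsum("bnm,bnd->bmd", dispatch, x)
        # Each expert processes its own contiguous group of slots.
        groups = slots.chunk(self.num_experts, dim=1)
        expert_out = torch.cat(
            [expert(g) for expert, g in zip(self.experts, groups)], dim=1
        )
        return torch.einsum("bnm,bmd->bnd", combine, expert_out)


# Usage with ViT-style tokens: 196 patches of width 768.
layer = SoftMoELayer(dim=768, num_experts=4, slots_per_expert=2)
out = layer(torch.randn(2, 196, 768))  # -> (2, 196, 768)
```

Because both the dispatch and combine weights are plain softmaxes, the layer avoids the discrete, non-differentiable expert assignment of classic MoE routing.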
Alternatives and similar repositories for SoftMoE
Users interested in SoftMoE are comparing it to the repositories listed below.
- Sample code for the paper "VLM-driven Behavior Tree for Context-aware Task Planning" ☆18 · Jan 10, 2025 · Updated last year
- ☆13 · Feb 21, 2024 · Updated 2 years ago
- Official Code of SATS: Self-Attention Transfer for Continual Semantic Segmentation ☆25 · Feb 23, 2023 · Updated 3 years ago
- ☆16 · Nov 15, 2024 · Updated last year