sharc-lab / Edge-MoE

Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-level Sparsity via Mixture-of-Experts

Related projects: