pprp / Awesome-Efficient-MoE
☆16 · Updated this week
Related projects
Alternatives and complementary repositories for Awesome-Efficient-MoE
- [ICML'24 Oral] APT: Adaptive Pruning and Tuning Pretrained Language Models for Efficient Training and Inference · ☆28 · Updated 5 months ago
- Adapting LLaMA Decoder to Vision Transformer · ☆27 · Updated 6 months ago
- [MM2024, oral] "Self-Supervised Visual Preference Alignment" https://arxiv.org/abs/2404.10501