FORA introduces a simple yet effective caching mechanism in the Diffusion Transformer architecture for faster inference sampling.
☆55 · Jul 8, 2024 · Updated last year
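As a hedged sketch of the general idea behind such timestep caching (illustrative only, not FORA's actual implementation), a transformer layer can recompute its features only every N denoising steps and reuse the cached activation in between:

```python
class CachedLayer:
    """Toy stand-in for a diffusion-transformer block whose output is
    cached across denoising steps (a sketch, not FORA's real code)."""

    def __init__(self, compute_fn, cache_interval=3):
        self.compute_fn = compute_fn        # the expensive layer computation
        self.cache_interval = cache_interval
        self.cached_output = None
        self.num_computes = 0               # bookkeeping for the demo

    def __call__(self, x, step):
        # Recompute on the first step and every `cache_interval`-th step;
        # otherwise reuse the activation cached at an earlier step.
        if self.cached_output is None or step % self.cache_interval == 0:
            self.cached_output = self.compute_fn(x)
            self.num_computes += 1
        return self.cached_output


# Demo: a "layer" that doubles its input, run over 10 denoising steps.
layer = CachedLayer(lambda x: 2 * x, cache_interval=3)
outputs = [layer(1.0, step=t) for t in range(10)]
print(layer.num_computes)  # computed on steps 0, 3, 6, 9 -> 4 times
```

With `cache_interval=3`, the expensive computation runs on only 4 of the 10 steps; the remaining 6 steps return the cached tensor, which is the source of the speedup that these caching methods trade against sample quality.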
Alternatives and similar repositories for FORA
Users interested in FORA are comparing it to the libraries listed below.
- [NeurIPS 2024] Learning-to-Cache: Accelerating Diffusion Transformer via Layer Caching ☆119 · Jul 15, 2024 · Updated last year
- [ICCV 2025] From Reusing to Forecasting: Accelerating Diffusion Models with TaylorSeers ☆383 · Mar 2, 2026 · Updated last month
- [ICLR 2025] Accelerating Diffusion Transformers with Token-wise Feature Caching ☆215 · Mar 14, 2025 · Updated last year
- Managed Database hosting by DigitalOcean • AdPostgreSQL, MySQL, MongoDB, Kafka, Valkey, and OpenSearch available. Automatically scale up storage and focus on building your apps.
- 📚 Collection of awesome generation acceleration resources.