Tammytcl / Awesome-Diffusion-Acceleration-Cache
A curated list of research papers, resources, and advancements on Diffusion Cache and related efficient diffusion model acceleration techniques.
☆41 · Updated last month
Alternatives and similar repositories for Awesome-Diffusion-Acceleration-Cache
Users interested in Awesome-Diffusion-Acceleration-Cache often compare it with the repositories listed below.
- [NeurIPS 2025] ScaleKV: Memory-Efficient Visual Autoregressive Modeling with Scale-Aware KV Cache Compression ☆49 · Updated 4 months ago
- FORA introduces a simple yet effective caching mechanism in the Diffusion Transformer architecture for faster inference sampling (see the sketch after this list). ☆49 · Updated last year
- [NeurIPS 2024] Learning-to-Cache: Accelerating Diffusion Transformer via Layer Caching ☆116 · Updated last year
- [NeurIPS 2024] BiDM: Pushing the Limit of Quantization for Diffusion Models
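
The FORA and Learning-to-Cache entries above share a common idea: transformer block outputs change slowly across adjacent denoising steps, so they can be recomputed only occasionally and reused in between. Below is a minimal, hypothetical sketch of that step-based block caching; `CachedBlock` and `cache_interval` are illustrative names, not APIs from any listed repository.

```python
import torch
import torch.nn as nn

class CachedBlock(nn.Module):
    """Wraps a transformer block and recomputes its output only every
    `cache_interval` denoising steps, reusing the cached features otherwise.
    (Illustrative only; real methods decide per layer and/or per step.)"""

    def __init__(self, block: nn.Module, cache_interval: int = 2):
        super().__init__()
        self.block = block
        self.cache_interval = cache_interval
        self._cached = None  # block output saved at the last refresh step

    def forward(self, x: torch.Tensor, step: int) -> torch.Tensor:
        if self._cached is None or step % self.cache_interval == 0:
            self._cached = self.block(x)  # full recompute on refresh steps
        return self._cached               # cheap reuse on the other steps

# Toy usage: small MLP blocks stand in for DiT attention/MLP blocks.
blocks = nn.ModuleList(
    CachedBlock(nn.Sequential(nn.Linear(64, 64), nn.GELU()), cache_interval=2)
    for _ in range(4)
)

x = torch.randn(1, 16, 64)          # (batch, tokens, hidden)
for step in range(8):               # denoising loop over timesteps
    h = x
    for blk in blocks:
        h = blk(h, step)
    # ...feed h into the noise-prediction head / scheduler update here...
```

Roughly speaking, FORA-style approaches fix the reuse schedule ahead of time, while Learning-to-Cache learns which layers can be skipped at which steps; both trade a small amount of fidelity for lower per-step compute.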