chengzeyi / ParaAttention

Context parallel attention that accelerates DiT model inference with dynamic caching
165 stars · Updated this week

Alternatives and similar repositories for ParaAttention:
