feifeibear / long-context-attention
USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
Stars: 644 · Jan 15, 2026 · Updated last month
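
The description above refers to "unified (hybrid, 2D) sequence parallelism": the token dimension of a long sequence is sharded across a 2D grid of devices rather than a single 1D ring. The sketch below is illustrative only and does not use the repository's actual API; the function name, the `ring_degree`/`ulysses_degree` parameters, and the row-major placement are all assumptions chosen to show how a 2D grid maps ranks to sequence chunks.

```python
# Minimal sketch (NOT the repo's API): how a 2D hybrid sequence-parallel
# layout might assign chunks of a long sequence to ranks arranged in a
# ring_degree x ulysses_degree grid. All names here are illustrative.

def chunk_for_rank(seq_len, ring_degree, ulysses_degree, rank):
    """Return the (start, end) token slice owned by `rank`."""
    world = ring_degree * ulysses_degree
    assert seq_len % world == 0, "sequence must split evenly across ranks"
    chunk = seq_len // world
    # Row-major placement: consecutive ranks hold adjacent chunks,
    # so each inner group of `ulysses_degree` ranks covers a
    # contiguous span of the sequence.
    start = rank * chunk
    return start, start + chunk

# Example: 8192 tokens over a 2x4 grid -> 8 chunks of 1024 tokens each.
for r in range(8):
    print(r, chunk_for_rank(8192, 2, 4, r))
```

In a real hybrid scheme the two grid axes run different communication patterns (e.g. ring-style peer-to-peer on one axis, all-to-all on the other), which is the point of combining them; this sketch only captures the data layout.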
