feifeibear / long-context-attention

USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long-Context Transformer Model Training and Inference
424 · Updated this week
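
For context, USP unifies two sequence-parallel schemes, ring-attention-style sequence sharding and Ulysses-style head sharding, over a 2D device mesh. Below is a minimal, illustrative sketch of that 2D layout idea, not the library's API: the mesh sizes, rank layout, and names such as `ring_degree` are assumptions for illustration only.

```python
# Illustrative sketch (not long-context-attention's API) of a 2D sequence-
# parallel layout: one mesh dimension shards the sequence (ring style),
# the other shards attention heads (Ulysses style, post all-to-all).
import numpy as np

seq_len, num_heads = 16, 8          # toy sizes (assumed for the example)
ring_degree, ulysses_degree = 2, 4  # hypothetical 2D mesh: 2 x 4 = 8 ranks

for ring_rank in range(ring_degree):
    for ulysses_rank in range(ulysses_degree):
        # Ring dimension: each rank owns a contiguous shard of the sequence.
        seq_shard = np.arange(seq_len).reshape(ring_degree, -1)[ring_rank]
        # Ulysses dimension: after an all-to-all, each rank holds a subset
        # of heads but the full local sequence shard.
        head_shard = np.arange(num_heads).reshape(ulysses_degree, -1)[ulysses_rank]
        print(f"rank ({ring_rank},{ulysses_rank}): "
              f"seq {seq_shard.tolist()}, heads {head_shard.tolist()}")
```

Running the sketch prints which sequence positions and which heads each (ring, ulysses) rank would own, which is the core idea behind combining the two parallel dimensions.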

Alternatives and similar repositories for long-context-attention:

Users interested in long-context-attention are comparing it to the libraries listed below.