feifeibear / long-context-attention

USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
519 stars · Updated 3 weeks ago
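USP's "unified" design combines two sequence-parallel schemes, one of which (DeepSpeed-Ulysses style) redistributes shards with an all-to-all so that each rank holds the full sequence for a subset of attention heads. The following is a minimal single-process NumPy simulation of that idea, not this repository's API; all names and shapes here are illustrative assumptions.

```python
import numpy as np

def attention(q, k, v):
    """Plain softmax attention for one head: q, k, v are (seq, dim)."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(scores - scores.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

rng = np.random.default_rng(0)
seq, heads, dim, ranks = 8, 4, 16, 4  # toy sizes; heads and seq divisible by ranks
q, k, v = (rng.standard_normal((heads, seq, dim)) for _ in range(3))

# Reference: full (non-parallel) attention, head by head.
ref = np.stack([attention(q[h], k[h], v[h]) for h in range(heads)])

def shard_seq(x):
    # Each simulated rank starts with a sequence shard of ALL heads:
    # list of `ranks` arrays, each (heads, seq/ranks, dim).
    return np.split(x, ranks, axis=1)

def all_to_all(shards):
    # Simulated all-to-all: after the exchange, rank r holds the FULL
    # sequence but only its own block of heads/ranks heads.
    hp = heads // ranks
    return [
        np.concatenate([s[r * hp:(r + 1) * hp] for s in shards], axis=1)
        for r in range(ranks)
    ]

q_sh, k_sh, v_sh = (all_to_all(shard_seq(x)) for x in (q, k, v))

# Each rank now computes ordinary attention over its head subset.
out = [
    np.stack([attention(q_sh[r][h], k_sh[r][h], v_sh[r][h])
              for h in range(heads // ranks)])
    for r in range(ranks)
]

# Concatenating the per-rank head blocks recovers the full result
# (a real system would instead run an inverse all-to-all here to
# return to sequence sharding).
full = np.concatenate(out, axis=0)
assert np.allclose(full, ref)
```

The sketch shows why the all-to-all trick works: attention mixes information only along the sequence axis, never across heads, so once each rank owns the whole sequence for its head block, its local computation is exact.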

Alternatives and similar repositories for long-context-attention

Users interested in long-context-attention are comparing it to the libraries listed below.
