kyegomez / Infini-attention

Implementation of the paper "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" from Google, in PyTorch.
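The core idea of the paper is an attention head that combines standard local dot-product attention within a segment with retrieval from a compressive memory carried across segments, mixed by a learned gate. Below is a minimal single-head PyTorch sketch of that mechanism using the paper's linear (non-delta) memory update; the class and argument names are illustrative assumptions, not this repository's actual API.

```python
# Minimal sketch of the Infini-attention mechanism described in the paper.
# Single head, linear (non-delta) memory update; names are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class InfiniAttentionHead(nn.Module):
    """One attention head with a compressive memory carried across segments."""

    def __init__(self, dim: int, head_dim: int = 64):
        super().__init__()
        self.to_q = nn.Linear(dim, head_dim, bias=False)
        self.to_k = nn.Linear(dim, head_dim, bias=False)
        self.to_v = nn.Linear(dim, head_dim, bias=False)
        self.to_out = nn.Linear(head_dim, dim, bias=False)
        # Learned scalar gate mixing memory retrieval with local attention.
        self.gate = nn.Parameter(torch.zeros(1))
        self.head_dim = head_dim

    def forward(self, x, memory=None, norm=None):
        # x: (batch, segment_len, dim)
        b, n, _ = x.shape
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)

        # Local causal dot-product attention within the current segment.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        causal = torch.triu(torch.ones(n, n, dtype=torch.bool, device=x.device), 1)
        local_out = scores.masked_fill(causal, float("-inf")).softmax(dim=-1) @ v

        # Retrieve from the compressive memory using ELU+1 feature maps.
        sigma_q, sigma_k = F.elu(q) + 1, F.elu(k) + 1
        if memory is None:
            memory = torch.zeros(b, self.head_dim, self.head_dim, device=x.device)
            norm = torch.zeros(b, self.head_dim, 1, device=x.device)
        mem_out = (sigma_q @ memory) / (sigma_q @ norm + 1e-6)

        # Update memory and its normalization term with this segment's keys/values.
        new_memory = memory + sigma_k.transpose(-2, -1) @ v
        new_norm = norm + sigma_k.sum(dim=1, keepdim=True).transpose(-2, -1)

        # Blend long-term (memory) and local attention with the learned gate.
        g = torch.sigmoid(self.gate)
        out = self.to_out(g * mem_out + (1 - g) * local_out)
        return out, new_memory, new_norm


if __name__ == "__main__":
    head = InfiniAttentionHead(dim=128)
    memory, norm = None, None
    # Process a long sequence segment by segment, carrying the memory forward.
    for segment in torch.randn(4, 8, 32, 128):  # 4 segments, batch 8, length 32
        out, memory, norm = head(segment, memory, norm)
    print(out.shape)  # torch.Size([8, 32, 128])
```

Because the memory and normalization term have a fixed size regardless of how many segments have been processed, the per-segment cost stays bounded while context from earlier segments remains retrievable.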

Alternatives and similar repositories for Infini-attention:

Users who are interested in Infini-attention are comparing it to the libraries listed below.