lucidrains / taylor-series-linear-attention
Explorations into the recently proposed Taylor Series Linear Attention
☆100 · Updated last year
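For context, a minimal non-causal sketch of the idea behind this repository (helper names here are illustrative, not the repo's API): softmax attention's `exp(q · k)` kernel is approximated by its second-order Taylor expansion, `exp(s) ≈ 1 + s + s²/2`, which factors into finite feature maps so attention can be computed with cost linear in sequence length.

```python
import numpy as np

def taylor_feature_map(x):
    # x: (n, d). Since (q.k)^2 = <q ⊗ q, k ⊗ k>, the feature map
    # phi(x) = [1, x, vec(x x^T)/sqrt(2)] satisfies
    # phi(q).phi(k) = 1 + q.k + (q.k)^2 / 2  ≈  exp(q.k).
    n, d = x.shape
    ones = np.ones((n, 1))
    second = np.einsum('ni,nj->nij', x, x).reshape(n, d * d) / np.sqrt(2)
    return np.concatenate([ones, x, second], axis=-1)

def taylor_linear_attention(q, k, v):
    # Non-causal linear attention with the Taylor-approximated kernel.
    # Keys/values are summarised into fixed-size states, so the cost is
    # O(n * d^2) in sequence length n rather than O(n^2).
    fq, fk = taylor_feature_map(q), taylor_feature_map(k)
    kv = fk.T @ v               # (D, d_v) key-value summary state
    z = fk.sum(axis=0)          # (D,) normaliser state
    return (fq @ kv) / (fq @ z)[:, None]
```

With small query/key magnitudes the output closely tracks exact softmax attention; the approximation degrades as `q · k` grows, which is where the repo's tricks (and a causal variant) come in.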
Alternatives and similar repositories for taylor-series-linear-attention
Users interested in taylor-series-linear-attention are comparing it to the repositories listed below.
- Some personal experiments around routing tokens to different autoregressive attention, akin to mixture-of-experts ☆119 · Updated last year
- Implementation of Infini-Transformer in Pytorch ☆113 · Updated 11 months ago
- Implementation of GateLoop Transformer in Pytorch and Jax ☆91 · Updated last year
- Exploration into the proposed "Self Reasoning Tokens" by Felipe Bonetto