xyltt / Linear-Transformer
Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
☆22 · Updated 4 years ago
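For context, the headline repository implements the kernelized (linear) attention from the paper above. Below is a minimal, non-causal sketch of that mechanism in PyTorch, assuming a `(batch, seq, heads, dim)` tensor layout and the paper's `elu(x) + 1` feature map; the function names and the `eps` stabilizer are illustrative, not taken from this repository's code.

```python
import torch
import torch.nn.functional as F

def elu_feature_map(x):
    # Strictly positive feature map phi(x) = elu(x) + 1, as proposed in the paper.
    return F.elu(x) + 1

def linear_attention(q, k, v, eps=1e-6):
    # q, k: (batch, seq, heads, dim); v: (batch, seq, heads, dim_v)
    q, k = elu_feature_map(q), elu_feature_map(k)
    # Contract over keys first: sum_s phi(k_s) v_s^T -> (batch, heads, dim, dim_v).
    # This is what makes the cost linear in sequence length.
    kv = torch.einsum("nshd,nshm->nhdm", k, v)
    # Normalizer: phi(q_l) dotted with the sum of all phi(k_s).
    z = 1.0 / (torch.einsum("nlhd,nhd->nlh", q, k.sum(dim=1)) + eps)
    return torch.einsum("nlhd,nhdm,nlh->nlhm", q, kv, z)

# Example: same output shape as softmax attention, but O(N) instead of O(N^2).
q = k = v = torch.randn(2, 128, 8, 64)
out = linear_attention(q, k, v)  # (2, 128, 8, 64)
```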
Alternatives and similar repositories for Linear-Transformer
Users interested in Linear-Transformer are comparing it to the libraries listed below.
- Code for Explicit Sparse Transformer ☆62 · Updated last year
- Mixture of Attention Heads ☆44 · Updated 2 years ago
- [ICLR 2022] "Anti-Oversmoothing in Deep Vision Transformers via the Fourier Domain Analysis: From Theory to Practice" by Peihao Wang, Wen… ☆79 · Updated last year
- Implementation of AAAI 2022 paper: Go Wider Instead of Deeper ☆32 · Updated 2 years ago
- PyTorch implementation of Pay Attention to MLPs ☆40 · Updated 3 years ago
- [ICLR 2022] Official implementation of cosformer-attention in cosFormer: Rethinking Softmax in Attention