tatp22 / linformer-pytorch
My take on a practical implementation of Linformer for Pytorch.
☆422 · Updated Jul 27, 2022 (3 years ago)
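For context, the core idea behind Linformer is to project the keys and values along the sequence axis down to a fixed length k, so self-attention costs O(n·k) instead of O(n²). The sketch below is a minimal, single-head illustration of that idea; the class and parameter names are illustrative only and are not this repository's API.

```python
import torch
import torch.nn as nn

class LinformerAttentionSketch(nn.Module):
    """Single-head, non-causal sketch of Linformer-style attention.

    Keys and values are projected along the sequence axis from length n
    down to a fixed k, so the attention matrix is (n x k) rather than (n x n).
    """
    def __init__(self, seq_len, dim, k=128):
        super().__init__()
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        # learned low-rank projections over the sequence dimension (n -> k)
        self.E = nn.Parameter(torch.randn(seq_len, k) / seq_len ** 0.5)
        self.F = nn.Parameter(torch.randn(seq_len, k) / seq_len ** 0.5)
        self.scale = dim ** -0.5

    def forward(self, x):                                       # x: (batch, n, dim)
        q = self.to_q(x)
        k = torch.einsum('bnd,nk->bkd', self.to_k(x), self.E)   # (batch, k, dim)
        v = torch.einsum('bnd,nk->bkd', self.to_v(x), self.F)   # (batch, k, dim)
        attn = torch.softmax(torch.einsum('bnd,bkd->bnk', q, k) * self.scale, dim=-1)
        return torch.einsum('bnk,bkd->bnd', attn, v)             # (batch, n, dim)

# toy usage: the sequence length is fixed because E and F are length-specific
x = torch.randn(2, 1024, 64)
out = LinformerAttentionSketch(seq_len=1024, dim=64, k=128)(x)  # (2, 1024, 64)
```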
Alternatives and similar repositories for linformer-pytorch
Users interested in linformer-pytorch are comparing it to the libraries listed below.
- Implementation of Linformer for Pytorch ☆305 · Updated Jan 5, 2024 (2 years ago)
- Transformer based on a variant of attention that is linear in complexity with respect to sequence length ☆827 · Updated May 5, 2024 (last year)
- Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention ☆270 · Updated Aug 10, 2021 (4 years ago)
- Pytorch library for fast transformer implementations ☆1,761 · Updated Mar 23, 2023 (2 years ago)
- An implementation of Performer, a linear attention-based transformer, in Pytorch ☆1,172 · Updated Feb 2, 2022 (4 years ago)
- Reformer, the efficient Transformer, in Pytorch ☆2,193 · Updated Jun 21, 2023 (2 years ago)
- Fully featured implementation of Routing Transformer ☆300 · Updated Nov 6, 2021 (4 years ago)
- Reproducing the Linear Multihead Attention introduced in the Linformer paper (Linformer: Self-Attention with Linear Complexity) ☆75 · Updated Jun 23, 2020 (5 years ago)
- [ICLR 2020] Lite Transformer with Long-Short Range Attention ☆611 · Updated Jul 11, 2024 (last year)
- ☆221 · Updated Jun 8, 2020 (5 years ago)
- DeLighT: Very Deep and Light-Weight Transformers ☆469 · Updated Oct 16, 2020 (5 years ago)
- Longformer: The Long-Document Transformer ☆2,186 · Updated Feb 8, 2023 (3 years ago)
- Repository for the NeurIPS 2020 Spotlight "AdaBelief Optimizer: Adapting stepsizes by the belief in observed gradients" ☆1,068 · Updated Aug 9, 2024 (last year)
- List of efficient attention modules ☆1,022 · Updated Aug 23, 2021 (4 years ago)
- Implementation of N-Grammer, augmenting Transformers with latent n-grams, in Pytorch ☆76 · Updated Dec 4, 2022 (3 years ago)
- Implementation of Memformer, a Memory-augmented Transformer, in Pytorch ☆126 · Updated Nov 13, 2020 (5 years ago)
- Understanding the Difficulty of Training Transformers ☆332 · Updated May 31, 2022 (3 years ago)
- SentAugment is a data augmentation technique for NLP that retrieves similar sentences from a large bank of sentences. It can be used in c… ☆359 · Updated Feb 22, 2022 (3 years ago)
- FastFormers - highly efficient transformer models for NLU ☆709 · Updated Mar 21, 2025 (10 months ago)
- Hopfield Networks is All You Need ☆1,897 · Updated Apr 23, 2023 (2 years ago)
- Implements Reformer: The Efficient Transformer in pytorch. ☆86 · Updated Mar 1, 2020 (5 years ago)
- Long Range Arena for Benchmarking Efficient Transformers ☆777 · Updated Dec 16, 2023 (2 years ago)
- An implementation of the efficient attention module. ☆328 · Updated Nov 30, 2020 (5 years ago)
- ☆388 · Updated Oct 18, 2023 (2 years ago)
- Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers" ☆1,607 · Updated Aug 12, 2020 (5 years ago)
- Implementation of RealFormer using pytorch ☆101 · Updated Dec 27, 2020 (5 years ago)
- Fast Block Sparse Matrices for Pytorch ☆549 · Updated Jan 21, 2021 (5 years ago)
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis. ☆147 · Updated Jul 26, 2021 (4 years ago)
- PyTorch Codes for Haar Graph Pooling ☆11 · Updated Feb 16, 2023 (2 years ago)
- ☆3,684 · Updated Sep 21, 2022 (3 years ago)
- DisCo Transformer for Non-autoregressive MT ☆77 · Updated Jul 28, 2022 (3 years ago)
- PyTorch extensions for high performance and large scale training. ☆3,397 · Updated Apr 26, 2025 (9 months ago)
- Transformers without Tears: Improving the Normalization of Self-Attention ☆134 · Updated May 29, 2024 (last year)
- torch-optimizer -- collection of optimizers for Pytorch ☆3,161 · Updated Mar 22, 2024 (last year)
- Repository for the paper "Optimal Subarchitecture Extraction for BERT" ☆470 · Updated Jun 22, 2022 (3 years ago)
- Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch ☆1,194 · Updated Aug 22, 2023 (2 years ago)
- The implementation of the papers on dual learning of natural language understanding and generation. (ACL 2019, 2020; Findings of EMNLP 2020… ☆67 · Updated Oct 13, 2020 (5 years ago)
- Skyformer: Remodel Self-Attention with Gaussian Kernel and Nyström Method (NeurIPS 2021) ☆63 · Updated Apr 19, 2022 (3 years ago)
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch ☆59 · Updated Jan 13, 2021 (5 years ago)