A list of efficient attention modules
☆1,023 · updated Aug 23, 2021
Alternatives and similar repositories for awesome-fast-attention
Users interested in awesome-fast-attention are comparing it to the libraries listed below:
- Pytorch library for fast transformer implementations (☆1,763, updated Mar 23, 2023)
- An implementation of the efficient attention module (☆329, updated Nov 30, 2020)
- [ICLR 2020] Lite Transformer with Long-Short Range Attention (☆610, updated Jul 11, 2024)
- My take on a practical implementation of Linformer for Pytorch (☆421, updated Jul 27, 2022)
- Worth-reading papers and related resources on the attention mechanism, Transformers, and pretrained language models (PLMs) such as BERT (☆130, updated Mar 27, 2021)
- Exploring Self-attention for Image Recognition, CVPR 2020 (☆752, updated Jun 15, 2020)
- Official PyTorch implementation of Long-Short Transformer (NeurIPS 2021) (☆228, updated Apr 18, 2022)
- A curated list of multimodal-related research (☆1,389, updated Aug 5, 2023)
- DeLighT: Very Deep and Light-Weight Transformers (☆469, updated Oct 16, 2020)
- Understanding the Difficulty of Training Transformers (☆332, updated May 31, 2022)
- Awesome Transformer with Computer Vision (CV): a collection of papers on transformers in vision (☆3,565, updated Jan 7, 2025)
- PyTorch implementation of contrastive learning methods (☆1,995, updated Oct 4, 2023)
- Long Range Arena for benchmarking efficient transformers (☆782, updated Dec 16, 2023)
- ResNeSt: Split-Attention Networks (☆3,264, updated Dec 9, 2022)
- Awesome Transformers (self-attention) in Computer Vision (☆269, updated Jul 31, 2021)
- torch-optimizer: a collection of optimizers for Pytorch (☆3,163, updated Mar 22, 2024)
- ☆280 (updated Mar 22, 2021)
- CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019) (☆1,481, updated Mar 19, 2021)
- 🍀 Pytorch implementations of various attention mechanisms, MLP, re-parameterization, and convolution modules, helpful for further understanding the papers (☆12,164, updated Dec 6, 2024)
- ☆221 (updated Jun 8, 2020)
- A PyTorch implementation of the Transformer model in "Attention Is All You Need" (☆9,641, updated Apr 16, 2024)
- Transformer training code for sequential tasks (☆609, updated Sep 14, 2021)
- Fast, differentiable sorting and ranking in PyTorch (☆852, updated this week)
- Count the MACs/FLOPs of your PyTorch model (☆5,081, updated Jul 8, 2024)
- A curated list of awesome self-supervised methods (☆6,363, updated Feb 24, 2026)
- Awesome Knowledge-Distillation: knowledge-distillation papers (2014-2021) organized by category (☆2,654, updated May 30, 2023)
- Paper bank for self-supervised learning (☆585, updated Mar 14, 2023)
- A comprehensive list of awesome contrastive self-supervised learning papers (☆1,309, updated Sep 10, 2024)
- Implementing Attention Augmented Convolutional Networks using Pytorch (☆654, updated Jan 26, 2022)
- SimCLRv2: Big Self-Supervised Models are Strong Semi-Supervised Learners (☆4,458, updated May 22, 2023)
- GCNet: Non-local Networks Meet Squeeze-Excitation Networks and Beyond (☆1,218, updated Feb 16, 2021)
- Recent transformer-based computer-vision and related works (☆1,339, updated Aug 22, 2023)
- A curated list of NLP resources focused on Transformer networks, attention mechanisms, GPT, BERT, ChatGPT, LLMs, and transfer learning (☆1,130, updated Oct 27, 2024)
- Longformer: The Long-Document Transformer (☆2,189, updated Feb 8, 2023)
- A curated list of pretrained sentence and word embedding models (☆2,290, updated Apr 23, 2021)
- Automated Deep Learning: Neural Architecture Search Is Not the End (a curated list of AutoDL resources and an in-depth analysis) (☆2,336, updated Sep 26, 2022)
- Dual Attention Network for Scene Segmentation (CVPR 2019) (☆2,456, updated Dec 23, 2024)
- Recent Advances in Vision and Language Pre-Trained Models (VL-PTMs) (☆1,155, updated Aug 19, 2022)
- Single Headed Attention RNN: "Stop thinking with your head" (☆1,180, updated Nov 27, 2021)
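Several entries above (the Linformer implementation in particular) reduce self-attention's O(n²) cost by projecting keys and values down to a fixed length k before the softmax. A minimal NumPy sketch of that low-rank idea follows; all names, shapes, and projection choices here are illustrative and not taken from any of the listed libraries:

```python
import numpy as np

def linformer_style_attention(Q, K, V, E, F):
    """Low-rank attention: project K and V along the sequence axis.

    Q, K, V: (n, d) query/key/value matrices for one head.
    E, F:    (k, n) learned projections with k << n, so the softmax
             runs over k columns instead of n.
    """
    K_low = E @ K                                  # (k, d)
    V_low = F @ V                                  # (k, d)
    scores = Q @ K_low.T / np.sqrt(Q.shape[-1])    # (n, k)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1
    return weights @ V_low                         # (n, d), O(n*k*d) cost

rng = np.random.default_rng(0)
n, d, k = 16, 8, 4
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E, F = (rng.standard_normal((k, n)) / np.sqrt(n) for _ in range(2))
out = linformer_style_attention(Q, K, V, E, F)
print(out.shape)  # (16, 8)
```

The output keeps the full sequence length n, but the attention matrix is only n × k, which is the common thread behind most of the "efficient attention" repositories listed here.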