lucidrains / reformer-pytorch
Reformer, the efficient Transformer, in Pytorch
☆2,193 · Updated Jun 21, 2023 (2 years ago)
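For context, a minimal usage sketch of the library this page indexes, following the pattern shown in the lucidrains/reformer-pytorch README; the exact constructor arguments (`num_tokens`, `dim`, `depth`, `max_seq_len`, `causal`, etc.) are assumptions if the API has changed since.

```python
# Sketch only (assumed API per the repository README): a Reformer language
# model over a long sequence, using LSH attention in place of full attention.
import torch
from reformer_pytorch import ReformerLM

model = ReformerLM(
    num_tokens=20000,    # vocabulary size
    dim=512,
    depth=6,
    max_seq_len=8192,    # Reformer is aimed at long contexts like this
    heads=8,
    causal=True          # autoregressive language modeling
)

tokens = torch.randint(0, 20000, (1, 8192))
logits = model(tokens)   # shape: (1, 8192, 20000)
```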
Alternatives and similar repositories for reformer-pytorch
Users interested in reformer-pytorch are comparing it to the libraries listed below.
- An implementation of Performer, a linear attention-based transformer, in Pytorch · ☆1,172 · Updated Feb 2, 2022 (4 years ago)
- Pytorch library for fast transformer implementations · ☆1,761 · Updated Mar 23, 2023 (2 years ago)
- Longformer: The Long-Document Transformer · ☆2,186 · Updated Feb 8, 2023 (3 years ago)
- Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention · ☆270 · Updated Aug 10, 2021 (4 years ago)
- My take on a practical implementation of Linformer for Pytorch. · ☆422 · Updated Jul 27, 2022 (3 years ago)
- [ICLR 2020] Lite Transformer with Long-Short Range Attention · ☆611 · Updated Jul 11, 2024 (last year)
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch · ☆5,629 · Updated Feb 17, 2024 (last year)
- ☆221 · Updated Jun 8, 2020 (5 years ago)
- Transformer training code for sequential tasks · ☆610 · Updated Sep 14, 2021 (4 years ago)
- ☆3,685 · Updated Sep 21, 2022 (3 years ago)
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" · ☆6,491 · Updated Jan 14, 2026 (last month)
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. · ☆32,153 · Updated Sep 30, 2025 (4 months ago)
- Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers" · ☆1,608 · Updated Aug 12, 2020 (5 years ago)
- Fast, general, and tested differentiable structured prediction in PyTorch · ☆1,123 · Updated Apr 20, 2022 (3 years ago)
- Trax — Deep Learning with Clear Code and Speed · ☆8,305 · Updated Sep 26, 2025 (4 months ago)
- Implementation of Linformer for Pytorch · ☆305 · Updated Jan 5, 2024 (2 years ago)
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) · ☆9,395 · Updated Jan 26, 2026 (3 weeks ago)
- FastFormers - highly efficient transformer models for NLU · ☆709 · Updated Mar 21, 2025 (10 months ago)
- XLNet: Generalized Autoregressive Pretraining for Language Understanding · ☆6,177 · Updated May 28, 2023 (2 years ago)
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators · ☆2,371 · Updated Mar 23, 2024 (last year)
- torch-optimizer -- collection of optimizers for Pytorch · ☆3,162 · Updated Mar 22, 2024 (last year)
- Implements Reformer: The Efficient Transformer in pytorch. · ☆86 · Updated Mar 1, 2020 (5 years ago)
- A PyTorch Extension: Tools for easy mixed precision and distributed training in Pytorch · ☆8,918 · Updated Feb 9, 2026 (last week)
- PyTorch extensions for high performance and large scale training. · ☆3,397 · Updated Apr 26, 2025 (9 months ago)
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes. · ☆30,823 · Updated Feb 4, 2026 (last week)
- Official DeiT repository · ☆4,322 · Updated Mar 15, 2024 (last year)
- Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch · ☆879 · Updated Oct 30, 2023 (2 years ago)
- Understanding the Difficulty of Training Transformers · ☆332 · Updated May 31, 2022 (3 years ago)
- Single Headed Attention RNN - "Stop thinking with your head" · ☆1,180 · Updated Nov 27, 2021 (4 years ago)
- PyTorch original implementation of Cross-lingual Language Model Pretraining. · ☆2,924 · Updated Feb 14, 2023 (3 years ago)
- Google Research · ☆37,261 · Updated this week
- DeLighT: Very Deep and Light-Weight Transformers · ☆469 · Updated Oct 16, 2020 (5 years ago)
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… · ☆9,491 · Updated Feb 6, 2026 (last week)
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities · ☆22,021 · Updated Jan 23, 2026 (3 weeks ago)
- Fast Block Sparse Matrices for Pytorch · ☆549 · Updated Jan 21, 2021 (5 years ago)
- higher is a pytorch library allowing users to obtain higher order gradients over losses spanning training loops rather than individual tr… · ☆1,627 · Updated Mar 25, 2022 (3 years ago)
- BertViz: Visualize Attention in Transformer Models · ☆7,907 · Updated Jan 8, 2026 (last month)
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… · ☆24,993 · Updated this week
- An open-source NLP research library, built on PyTorch. · ☆11,890 · Updated Nov 22, 2022 (3 years ago)