Reformer, the efficient Transformer, in Pytorch
☆ 2,190 · Jun 21, 2023 · Updated 2 years ago
Alternatives and similar repositories for reformer-pytorch
Users interested in reformer-pytorch are comparing it to the libraries listed below.
- An implementation of Performer, a linear attention-based transformer, in Pytorch (☆ 1,177 · Feb 2, 2022 · Updated 4 years ago)
- Pytorch library for fast transformer implementations (☆ 1,769 · Mar 23, 2023 · Updated 3 years ago)
- Longformer: The Long-Document Transformer (☆ 2,193 · Feb 8, 2023 · Updated 3 years ago)
- Fully featured implementation of Routing Transformer (☆ 300 · Nov 6, 2021 · Updated 4 years ago)
- Transformer based on a variant of attention that is linear in complexity with respect to sequence length (☆ 828 · May 5, 2024 · Updated last year)
- Sinkhorn Transformer: Practical implementation of Sparse Sinkhorn Attention (☆ 270 · Aug 10, 2021 · Updated 4 years ago)
- My take on a practical implementation of Linformer for Pytorch (☆ 424 · Jul 27, 2022 · Updated 3 years ago)
- [ICLR 2020] Lite Transformer with Long-Short Range Attention (☆ 610 · Jul 11, 2024 · Updated last year)
- A concise but complete full-attention transformer with a set of promising experimental features from various papers (☆ 5,839 · Apr 21, 2026 · Updated last week)
- ☆ 3,699 · Sep 21, 2022 · Updated 3 years ago
- Transformer training code for sequential tasks (☆ 610 · Sep 14, 2021 · Updated 4 years ago)
- ☆ 220 · Jun 8, 2020 · Updated 5 years ago
- Implements Reformer: The Efficient Transformer in pytorch (☆ 86 · Mar 1, 2020 · Updated 6 years ago)
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch (☆ 5,629 · Feb 17, 2024 · Updated 2 years ago)
- Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers" (☆ 1,611 · Aug 12, 2020 · Updated 5 years ago)
- Trax: Deep Learning with Clear Code and Speed (☆ 8,303 · Sep 26, 2025 · Updated 7 months ago)
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python (☆ 32,207 · Sep 30, 2025 · Updated 6 months ago)
- Implementation of Linformer for Pytorch (☆ 306 · Jan 5, 2024 · Updated 2 years ago)
- Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute (☆ 1,528 · Nov 18, 2020 · Updated 5 years ago)
- Efficient Transformers for research, PyTorch and Tensorflow, using Locality Sensitive Hashing (☆ 97 · Jan 28, 2020 · Updated 6 years ago)
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" (☆ 6,508 · Jan 14, 2026 · Updated 3 months ago)
- Fast, general, and tested differentiable structured prediction in PyTorch (☆ 1,128 · Apr 20, 2022 · Updated 4 years ago)
- XLNet: Generalized Autoregressive Pretraining for Language Understanding (☆ 6,177 · May 28, 2023 · Updated 2 years ago)
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (☆ 2,372 · Mar 23, 2024 · Updated 2 years ago)
- FastFormers: highly efficient transformer models for NLU (☆ 709 · Mar 21, 2025 · Updated last year)
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) (☆ 9,468 · Apr 19, 2026 · Updated last week)
- DeLighT: Very Deep and Light-Weight Transformers (☆ 469 · Oct 16, 2020 · Updated 5 years ago)
- Fast Block Sparse Matrices for Pytorch (☆ 550 · Jan 21, 2021 · Updated 5 years ago)
- An implementation of local windowed attention for language modeling (☆ 498 · Jul 16, 2025 · Updated 9 months ago)
- Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch (☆ 1,200 · Aug 22, 2023 · Updated 2 years ago)
- Understanding the Difficulty of Training Transformers (☆ 332 · May 31, 2022 · Updated 3 years ago)
- A PyTorch Extension: Tools for easy mixed precision and distributed training in Pytorch (☆ 8,950 · Apr 20, 2026 · Updated last week)
- PyTorch original implementation of Cross-lingual Language Model Pretraining (☆ 2,932 · Feb 14, 2023 · Updated 3 years ago)
- Single Headed Attention RNN: "Stop thinking with your head" (☆ 1,181 · Nov 27, 2021 · Updated 4 years ago)
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes (☆ 31,079 · Updated this week)
- PyTorch extensions for high performance and large scale training (☆ 3,407 · Apr 26, 2025 · Updated last year)
- Google Research (☆ 37,778 · Updated this week)
- torch-optimizer: collection of optimizers for Pytorch (☆ 3,168 · Mar 22, 2024 · Updated 2 years ago)
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (☆ 22,107 · Jan 23, 2026 · Updated 3 months ago)