Implementation of Feedback Transformer in Pytorch
☆108 · Updated Mar 2, 2021
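For orientation, the Feedback Transformer's central idea is that every layer attends to a single shared memory built by aggregating all layers' outputs at each past timestep, instead of only attending to same-layer past states. A minimal sketch of that aggregation step follows; the function name and the learned-softmax-weight parameterization are illustrative assumptions, not this repository's API:

```python
import torch

def feedback_memory_step(layer_outputs, layer_weights):
    """Collapse the per-layer outputs at one timestep into a single shared
    memory vector via a softmax-weighted sum (a sketch of the Feedback
    Transformer's memory aggregation; names here are hypothetical)."""
    # layer_outputs: (num_layers, dim); layer_weights: (num_layers,)
    w = torch.softmax(layer_weights, dim=0)          # normalized mixing weights
    return (w.unsqueeze(-1) * layer_outputs).sum(0)  # (dim,)

num_layers, dim = 4, 8
layer_weights = torch.zeros(num_layers, requires_grad=True)  # hypothetical learned parameter
outputs = torch.randn(num_layers, dim)
memory_vec = feedback_memory_step(outputs, layer_weights)
```

At inference this shared memory means each new token needs only the aggregated vectors of past steps, not the full per-layer cache.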
Alternatives and similar repositories for feedback-transformer-pytorch
Users interested in feedback-transformer-pytorch are comparing it to the libraries listed below.
- Implementation of Memformer, a Memory-augmented Transformer, in Pytorch ☆126 · Updated Nov 13, 2020
- Implementation of Token Shift GPT, an autoregressive model that relies solely on shifting the sequence space for mixing ☆49 · Updated Jan 27, 2022
- Another attempt at a long-context / efficient transformer by me ☆38 · Updated Apr 11, 2022
- Implementation of a Transformer that Ponders, using the scheme from the PonderNet paper ☆81 · Updated Oct 30, 2021
- A variant of Transformer-XL where the memory is updated not with a queue, but with attention ☆49 · Updated Jul 31, 2020
- Implementation of Hierarchical Transformer Memory (HTM) for Pytorch ☆76 · Updated Sep 15, 2021
- Implementation of Multistream Transformers in Pytorch ☆54 · Updated Jul 31, 2021
- Implementation of Transformer in Transformer, pixel-level attention paired with patch-level attention for image classification, in Pytorch ☆310 · Updated Dec 27, 2021
- A Python library for highly configurable transformers, easing model architecture search and experimentation ☆48 · Updated Nov 30, 2021
- Fully featured implementation of Routing Transformer ☆300 · Updated Nov 6, 2021
- Implementation of Fast Transformer in Pytorch ☆176 · Updated Aug 26, 2021
- Implementation of Bottleneck Transformer in Pytorch ☆677 · Updated Sep 20, 2021
- Implementation of the Remixer Block from the Remixer paper, in Pytorch ☆36 · Updated Sep 27, 2021
- Implementation of a Transformer using ReLA (Rectified Linear Attention), from https://arxiv.org/abs/2104.07012 ☆49 · Updated Apr 6, 2022
- Implementation of Nyström Self-attention, from the paper Nyströmformer ☆145 · Updated Mar 24, 2025
- Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch ☆120 · Updated Aug 4, 2021
- An implementation of Performer, a linear attention-based transformer, in Pytorch ☆1,177 · Updated Feb 2, 2022
- A GPT, made only of MLPs, in Jax ☆59 · Updated Jun 23, 2021
- Implementation of COCO-LM, Correcting and Contrasting Text Sequences for Language Model Pretraining, in Pytorch ☆46 · Updated Mar 3, 2021
- A simple implementation of a deep linear Pytorch module ☆21 · Updated Oct 16, 2020
- Implementation of the Adan (ADAptive Nesterov momentum algorithm) Optimizer in Pytorch ☆253 · Updated Sep 1, 2022
- Graph neural network message passing reframed as a Transformer with local attention ☆70 · Updated Dec 24, 2022
- An unofficial, customized fork of DETR, optimized for object detection on real-world custom datasets ☆12 · Updated Aug 22, 2020
- An attempt at an implementation of Glom, Geoffrey Hinton's new idea that integrates concepts from neural fields and top-down-bottom-up processing ☆196 · Updated Mar 27, 2021
- Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch ☆1,200 · Updated Aug 22, 2023
- Pytorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules ☆58 · Updated Dec 2, 2020
- Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning ☆166 · Updated Feb 12, 2024
- An implementation of Transformer with Expire-Span, a circuit for learning which memories to retain ☆34 · Updated Oct 30, 2020
- GPT, but made only out of MLPs ☆89 · Updated May 25, 2021
- Authors' implementation of LieTransformer: Equivariant Self-Attention for Lie Groups ☆36 · Updated Feb 5, 2021
- Trains Transformer model variants; data isn't shuffled between batches ☆145 · Updated Oct 5, 2022
- Pytorch library for fast transformer implementations ☆1,767 · Updated Mar 23, 2023
- Implementation of RealFormer using Pytorch ☆101 · Updated Dec 27, 2020
- Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute ☆1,530 · Updated Nov 18, 2020
- Implementation of gMLP, an all-MLP replacement for Transformers, in Pytorch ☆430 · Updated Aug 14, 2021
- Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GanFormer and TransGan papers ☆155 · Updated Apr 27, 2021
- Implementation of Hourglass Transformer, in Pytorch, from Google and OpenAI ☆99 · Updated Dec 31, 2021
- Repository for the NeurIPS 2020 Spotlight "AdaBelief Optimizer: Adapting stepsizes by the belief in observed gradients" ☆1,072 · Updated Aug 9, 2024
- Official implementation of Neural networks with recurrent generative feedback (NeurIPS 2020) ☆22 · Updated Nov 10, 2020
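Many of the entries above are single-mechanism variations on standard attention. As one illustrative example, the ReLA repository listed above replaces attention's softmax with a ReLU. A minimal sketch of that substitution, with the paper's additional normalization omitted and all names chosen here for illustration:

```python
import torch
import torch.nn.functional as F

def rela_attention(q, k, v):
    """Scaled dot-product attention with ReLU in place of softmax
    (Rectified Linear Attention, sketched; the paper also normalizes
    the output, which this sketch omits)."""
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    weights = F.relu(scores)  # ReLU instead of softmax: sparse, unnormalized
    return weights @ v

q = torch.randn(2, 5, 16)
k = torch.randn(2, 5, 16)
v = torch.randn(2, 5, 16)
out = rela_attention(q, k, v)
```

Because ReLU zeroes out negative scores, the attention pattern becomes sparse, and the rows no longer sum to one, which is why the paper pairs it with a normalization layer.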