jysohn1108 / Looped-Transformer
Official implementation of the looped transformer architecture proposed in the paper "Looped Transformers as Programmable Computers"
☆27 · Updated 2 years ago
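As a rough illustration of the paper's core idea, here is a minimal sketch in PyTorch (not the repository's actual code): a single weight-tied transformer layer is applied repeatedly, trading depth for iteration count, with the sequence itself carrying state between loops. Layer sizes and the loop count below are hypothetical.

```python
# Minimal, illustrative sketch of a looped transformer.
# Not the repository's implementation; sizes are hypothetical.
import torch
import torch.nn as nn

class LoopedTransformer(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_loops=12):
        super().__init__()
        # One shared layer, reused on every iteration (weight tying).
        self.layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.n_loops = n_loops

    def forward(self, x):
        # Feed the output back as input n_loops times; the sequence
        # acts as the "machine state" carried between iterations.
        for _ in range(self.n_loops):
            x = self.layer(x)
        return x

tokens = torch.randn(1, 16, 64)    # (batch, seq, d_model)
out = LoopedTransformer()(tokens)  # same shape as the input
```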
Alternatives and similar repositories for Looped-Transformer
Users interested in Looped-Transformer are comparing it to the repositories listed below
- ☆33 · Updated last year
- ☆23 · Updated 9 months ago
- ☆86 · Updated last year
- ☆53 · Updated last year
- ☆48 · Updated last year
- Universal Neurons in GPT2 Language Models ☆30 · Updated last year
- Official repository of the paper "RNNs Are Not Transformers (Yet): The Key Bottleneck on In-context Retrieval" ☆27 · Updated last year
- [NeurIPS 2024 Spotlight] Code and data for the paper "Finding Transformer Circuits with Edge Pruning" ☆61 · Updated 2 months ago
- ☆103 · Updated last year
- ☆45 · Updated 2 years ago
- Code for the NeurIPS 2024 Spotlight "Scaling Laws and Compute-Optimal Training Beyond Fixed Training Durations" ☆84 · Updated last year
- [NeurIPS 2023] Learning Transformer Programs ☆162 · Updated last year
- ☆33 · Updated last year
- ☆73 · Updated last year
- A library for efficient patching and automatic circuit discovery ☆78 · Updated 3 months ago
- Self-Supervised Alignment with Mutual Information ☆21 · Updated last year
- Parallelizing non-linear sequential models over the sequence length ☆54 · Updated 4 months ago
- Code for the paper "The Impact of Positional Encoding on Length Generalization in Transformers" (NeurIPS 2023) ☆136 · Updated last year
- Code for Adaptive Data Optimization ☆26 · Updated 10 months ago
- ☆33 · Updated 9 months ago
- ☆23 · Updated last year
- Learning from preferences is a common paradigm for fine-tuning language models. Yet, many algorithmic design decisions come into play. Ou… ☆32 · Updated last year
- Hrrformer: A Neuro-symbolic Self-attention Model (ICML 2023) ☆58 · Updated 3 weeks ago
- Revisiting Efficient Training Algorithms for Transformer-based Language Models (NeurIPS 2023) ☆80 · Updated 2 years ago
- Experiments on the impact of depth in transformers and SSMs ☆36 · Updated this week
- Official implementation of "Bootstrapping Language Models via DPO Implicit Rewards" ☆44 · Updated 6 months ago
- ☆38 · Updated last year
- ☆70 · Updated 3 years ago
- ☆83 · Updated 2 years ago
- Mamba support for TransformerLens ☆18 · Updated last year