tech-srl / RASP
An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers"
☆304 · Updated 6 months ago
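For context, RASP is a small functional language whose built-in sequence operators (`tokens`, `indices`, `select`, `aggregate`) mirror what a transformer's attention layers can compute. Below is a minimal sketch of the sequence-reversal example from the paper, written in the paper's notation; the exact syntax accepted by this repository's REPL may differ slightly, and `length` is assumed to be available from the paper's base library.

```
# Sketch of the reverse example from "Thinking Like Transformers"
# (paper notation; the repository's REPL syntax may differ slightly).
opp_index = length - indices - 1;        # position each token should move to
flip = select(indices, opp_index, ==);   # selector pairing position i with length-1-i
reverse = aggregate(flip, tokens);       # gather tokens through that selector
# e.g. reverse applied to "hello" yields "olleh"
```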
Alternatives and similar repositories for RASP:
Users interested in RASP are comparing it to the libraries listed below.
- An interactive exploration of Transformer programming. ☆261 · Updated last year
- Neural Networks and the Chomsky Hierarchy ☆204 · Updated 11 months ago
- ☆525 · Updated last year
- Python library which enables complex compositions of language models such as scratchpads, chain of thought, tool use, selection-inference… ☆204 · Updated 2 months ago
- Language Modeling with the H3 State Space Model ☆517 · Updated last year
- [NeurIPS 2023] Learning Transformer Programs ☆159 · Updated 10 months ago
- Emergent world representations: Exploring a sequence model trained on a synthetic task ☆178 · Updated last year
- ☆253 · Updated 2 years ago
- ☆215 · Updated 8 months ago
- ☆165 · Updated last year
- See the issue board for the current status of active and prospective projects! ☆65 · Updated 3 years ago
- Train very large language models in Jax. ☆203 · Updated last year
- ☆343 · Updated 11 months ago
- Language-annotated Abstraction and Reasoning Corpus ☆83 · Updated last year
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆177 · Updated last month
- Tools for understanding how transformer predictions are built layer-by-layer ☆480 · Updated 9 months ago
- Swarm training framework using Haiku + JAX + Ray for layer-parallel transformer language models on unreliable, heterogeneous nodes ☆236 · Updated last year
- seqax = sequence modeling + JAX ☆150 · Updated last week
- Implementation of Memorizing Transformers (ICLR 2022), attention net augmented with indexing and retrieval of memories using approximate … ☆631 · Updated last year
- Mechanistic Interpretability Visualizations using React ☆235 · Updated 3 months ago
- JAX implementation of the Llama 2 model ☆216 · Updated last year
- Puzzles for exploring transformers ☆335 · Updated last year
- Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022) ☆310 · Updated 2 years ago
- Mechanistic Interpretability for Transformer Models ☆50 · Updated 2 years ago
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆557 · Updated this week
- A puzzle to learn about prompting ☆124 · Updated last year
- Formal to Formal Mathematics Benchmark ☆338 · Updated last year
- Materials for the ConceptARC paper ☆89 · Updated 4 months ago
- Simple and efficient RevNet library for PyTorch with XLA and DeepSpeed support and parameter offload ☆126 · Updated 2 years ago
- Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale, TACL (2022) ☆124 · Updated 5 months ago