tech-srl / RASP
An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers"
☆286 · Updated 2 months ago
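As a rough illustration of the kind of program this interpreter evaluates, below is the sequence-reversal example from the "Thinking Like Transformers" paper, written in RASP-like syntax. This is a sketch based on the paper's running example; the exact syntax accepted by the repo's REPL (comment character, built-in names such as `tokens`, `indices`, `length`) may differ slightly.

```
# reverse the input sequence (running example from the paper)
opp_index = length - indices - 1;
flip = select(indices, opp_index, ==);
reverse = aggregate(flip, tokens);
```

The `select`/`aggregate` pair corresponds to a single attention layer in the compiled transformer: `select` builds the attention pattern that pairs each position with its mirror position, and `aggregate` gathers the tokens through that pattern, so an input like "abc" maps to "cba".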
Related projects
Alternatives and complementary repositories for RASP
- An interactive exploration of Transformer programming. ☆247 · Updated last year
- ☆508 · Updated 9 months ago
- Neural Networks and the Chomsky Hierarchy ☆187 · Updated 7 months ago
- ☆161 · Updated last year
- ☆334 · Updated 7 months ago
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆518 · Updated this week
- Python library which enables complex compositions of language models such as scratchpads, chain of thought, tool use, selection-inference… ☆197 · Updated 5 months ago
- Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022) ☆308 · Updated 2 years ago
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆175 · Updated 2 years ago
- Tools for understanding how transformer predictions are built layer-by-layer ☆432 · Updated 5 months ago
- Train very large language models in Jax. ☆195 · Updated last year
- Erasing concepts from neural representations with provable guarantees ☆210 · Updated last week
- ☆197 · Updated 4 months ago
- Language Modeling with the H3 State Space Model ☆514 · Updated last year
- Named tensors with first-class dimensions for PyTorch ☆322 · Updated last year
- Mistral: A strong, northwesterly wind: Framework for transparent and accessible large-scale language model training, built with Hugging F… ☆564 · Updated last year
- A library for bridging Python and HTML/Javascript (via Svelte) for creating interactive visualizations ☆174 · Updated 2 years ago
- Puzzles for exploring transformers ☆325 · Updated last year
- [NeurIPS 2023] Learning Transformer Programs ☆157 · Updated 6 months ago
- A domain-specific probabilistic programming language for modeling and inference with language models ☆112 · Updated last year
- Implementation of Memorizing Transformers (ICLR 2022), attention net augmented with indexing and retrieval of memories using approximate … ☆625 · Updated last year
- Implementation of https://srush.github.io/annotated-s4 ☆469 · Updated last year
- JAX implementation of the Llama 2 model ☆210 · Updated 9 months ago
- Notebooks accompanying Anthropic's "Toy Models of Superposition" paper ☆97 · Updated 2 years ago
- Swarm training framework using Haiku + JAX + Ray for layer parallel transformer language models on unreliable, heterogeneous nodes ☆237 · Updated last year
- Emergent world representations: Exploring a sequence model trained on a synthetic task ☆170 · Updated last year
- Extract full next-token probabilities via language model APIs ☆229 · Updated 9 months ago
- ☆57 · Updated 2 years ago
- ☆251 · Updated 2 years ago
- Inference code for LLaMA models in JAX ☆113 · Updated 6 months ago