srush / raspy
An interactive exploration of Transformer programming.
☆269 · Updated last year
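For context on what raspy and the RASP-related repositories below revolve around: RASP expresses transformer computations through two primitives, select (build an attention pattern) and aggregate (pool values through it). The snippet below is a minimal, self-contained sketch of that idea in plain Python; the names follow the "Thinking Like Transformers" paper and are not raspy's actual API.

```python
# Illustrative sketch of RASP-style select/aggregate primitives.
# Hypothetical helper names -- not raspy's real interface.

def select(keys, queries, predicate):
    """Attention-style selector matrix: entry [q][k] is True
    when predicate(keys[k], queries[q]) holds."""
    return [[predicate(k, q) for k in keys] for q in queries]

def aggregate(selector, values, default=0.0):
    """For each query position, average the values at selected
    key positions (mimicking uniform attention)."""
    out = []
    for row in selector:
        picked = [v for v, sel in zip(values, row) if sel]
        out.append(sum(picked) / len(picked) if picked else default)
    return out

# Example: reverse a sequence by selecting key position n - 1 - q.
tokens = [1, 2, 3, 4]
n = len(tokens)
indices = list(range(n))
flip = select(indices, indices, lambda k, q: k == n - 1 - q)
print(aggregate(flip, tokens))  # [4.0, 3.0, 2.0, 1.0]
```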
Alternatives and similar repositories for raspy
Users interested in raspy are comparing it to the libraries listed below:
- An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers" ☆320 · Updated last year
- git extension for {collaborative, communal, continual} model development ☆215 · Updated 11 months ago
- Puzzles for exploring transformers ☆371 · Updated 2 years ago
- ☆546 · Updated last year
- ☆456 · Updated last year
- A puzzle to learn about prompting ☆135 · Updated 2 years ago
- ☆283 · Updated last year
- Extract full next-token probabilities via language model APIs ☆247 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆669 · Updated this week
- Automatic gradient descent ☆213 · Updated 2 years ago
- Resources from the EleutherAI Math Reading Group ☆54 · Updated 7 months ago
- ☆166 · Updated 2 years ago
- 🧱 Modula software package ☆287 · Updated 2 months ago
- seqax = sequence modeling + JAX ☆167 · Updated 2 months ago
- ☆144 · Updated 2 years ago
- Simple Transformer in Jax ☆139 · Updated last year
- Understand and test language model architectures on synthetic tasks. ☆233 · Updated 3 weeks ago
- Implementing the RASP transformer programming language (https://arxiv.org/pdf/2106.06981.pdf) ☆58 · Updated 4 years ago
- Python library which enables complex compositions of language models such as scratchpads, chain of thought, tool use, selection-inference… ☆213 · Updated 4 months ago
- Functional local implementations of main model parallelism approaches ☆96 · Updated 2 years ago
- gzip Predicts Data-dependent Scaling Laws ☆34 · Updated last year
- Train very large language models in Jax. ☆209 · Updated last year
- Neural Networks and the Chomsky Hierarchy ☆210 · Updated last year
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆180 · Updated 5 months ago
- Language Modeling with the H3 State Space Model ☆518 · Updated 2 years ago
- Minimalistic, extremely fast, and hackable researcher's toolbench for GPT models in 307 lines of code. Reaches <3.8 validation loss on wi… ☆350 · Updated last year
- Swarm training framework using Haiku + JAX + Ray for layer parallel transformer language models on unreliable, heterogeneous nodes ☆242 · Updated 2 years ago
- A Jax-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT and more. ☆294 · Updated last year
- Notebooks accompanying Anthropic's "Toy Models of Superposition" paper ☆129 · Updated 3 years ago
- Named Tensors for Legible Deep Learning in JAX ☆210 · Updated this week