srush / raspy
An interactive exploration of Transformer programming.
☆258, updated last year
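The style of "Transformer programming" raspy explores can be sketched in plain Python. The snippet below is a toy illustration of RASP-style `select`/`aggregate` semantics (attention-like selection followed by uniform averaging); it is not raspy's actual API, just a hypothetical sketch of the underlying idea.

```python
# Toy sketch of RASP-style select/aggregate semantics.
# NOTE: this is an illustrative reimplementation, not raspy's API.

def select(keys, queries, predicate):
    """Build an attention-like selector matrix: sel[q][k] is True
    when predicate(keys[k], queries[q]) holds."""
    return [[predicate(k, q) for k in keys] for q in queries]

def aggregate(sel, values):
    """At each query position, average the values at the selected
    key positions (uniform attention over the selected set)."""
    out = []
    for row in sel:
        picked = [v for v, s in zip(values, row) if s]
        out.append(sum(picked) / len(picked) if picked else 0)
    return out

# Example: at each position, compute the mean of all earlier values
# using a causal (k < q) selector.
tokens = [3, 1, 4, 1, 5]
indices = list(range(len(tokens)))
sel = select(indices, indices, lambda k, q: k < q)
print(aggregate(sel, tokens))  # position 0 selects nothing -> 0
```

Composing such primitives into full programs (sorting, histograms, etc.) is the kind of exercise the notebook and the related "Transformer Puzzles" repository below walk through.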
Alternatives and similar repositories for raspy:
Users interested in raspy are comparing it to the repositories listed below.
- Puzzles for exploring transformers (☆332, updated last year)
- An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers" (☆301, updated 5 months ago)
- ☆416, updated 4 months ago
- A puzzle to learn about prompting (☆124, updated last year)
- git extension for {collaborative, communal, continual} model development (☆207, updated 3 months ago)
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax (☆542, updated this week)
- ☆211, updated 7 months ago
- Resources from the EleutherAI Math Reading Group (☆52, updated 2 months ago)
- ☆521, updated last year
- Extract full next-token probabilities via language model APIs (☆229, updated 11 months ago)
- ☆143, updated last year
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1 GPU + 1 Day (☆255, updated last year)
- Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022) (☆310, updated 2 years ago)
- JAX implementation of the Llama 2 model (☆215, updated last year)
- Solve puzzles. Learn CUDA. (☆62, updated last year)
- Python library which enables complex compositions of language models such as scratchpads, chain of thought, tool use, selection-inference… (☆202, updated last month)
- ☆164, updated last year
- A pure-functional implementation of a machine learning transformer model in Python/JAX (☆177, updated 2 weeks ago)
- Named Tensors for Legible Deep Learning in JAX (☆161, updated this week)
- Erasing concepts from neural representations with provable guarantees (☆222, updated 3 weeks ago)
- A Jax-based library for designing and training transformer models from scratch (☆281, updated 5 months ago)
- MinT: Minimal Transformer Library and Tutorials (☆252, updated 2 years ago)
- Simple Transformer in Jax (☆136, updated 7 months ago)
- Understand and test language model architectures on synthetic tasks (☆181, updated last month)
- Tools for understanding how transformer predictions are built layer-by-layer (☆475, updated 8 months ago)
- Neural Networks and the Chomsky Hierarchy (☆199, updated 10 months ago)
- Highly commented implementations of Transformers in PyTorch (☆132, updated last year)
- Automatic gradient descent (☆207, updated last year)
- A comprehensive deep dive into the world of tokens (☆220, updated 7 months ago)
- Named tensors with first-class dimensions for PyTorch (☆321, updated last year)