srush / Transformer-Puzzles
Puzzles for exploring transformers
☆347 · Updated 2 years ago
Alternatives and similar repositories for Transformer-Puzzles
Users interested in Transformer-Puzzles are comparing it to the repositories listed below.
- ☆431 · Updated 7 months ago
- What would you do with 1000 H100s... ☆1,048 · Updated last year
- A puzzle to learn about prompting ☆127 · Updated 2 years ago
- ☆262 · Updated 10 months ago
- An interactive exploration of Transformer programming. ☆264 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆584 · Updated this week
- Annotated version of the Mamba paper ☆482 · Updated last year
- Deep learning for dummies. All the practical details and useful utilities that go into working with real models. ☆793 · Updated last month
- ☆474 · Updated 10 months ago
- A subset of PyTorch's neural network modules, written in Python using OpenAI's Triton. ☆544 · Updated this week
- Resources for skilling up in AI alignment research engineering. Covers basics of deep learning, mechanistic interpretability, and RL. ☆213 · Updated last year
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… ☆381 · Updated this week
- Solve puzzles. Learn CUDA. ☆64 · Updated last year
- Puzzles for learning Triton ☆1,658 · Updated 6 months ago
- For optimization algorithm research and development. ☆518 · Updated this week
- Building blocks for foundation models. ☆500 · Updated last year
- ☆166 · Updated last year
- Fast bare-bones BPE for modern tokenizer training ☆156 · Updated last month
- An implementation of the transformer architecture as an Nvidia CUDA kernel ☆183 · Updated last year
- Small-scale distributed training of sequential deep learning models, built on Numpy and MPI. ☆133 · Updated last year
- Home for "How To Scale Your Model", a short blog-style textbook about scaling LLMs on TPUs ☆370 · Updated last month
- seqax = sequence modeling + JAX ☆155 · Updated last month
- Named tensors with first-class dimensions for PyTorch ☆329 · Updated last year
- A Jax-based library for building transformers, includes implementations of GPT, Gemma, LlaMa, Mixtral, Whisper, SWin, ViT and more. ☆287 · Updated 9 months ago
- ☆301 · Updated 11 months ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆392 · Updated this week
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1GPU + 1Day ☆254 · Updated last year
- Implementation of Diffusion Transformer (DiT) in JAX ☆276 · Updated 11 months ago
- Minimalistic, extremely fast, and hackable researcher's toolbench for GPT models in 307 lines of code. Reaches <3.8 validation loss on wi… ☆345 · Updated 10 months ago
- ☆156 · Updated last year