ayaka14732 / tpu-starter
Everything you want to know about Google Cloud TPU
☆532, updated last year
Alternatives and similar repositories for tpu-starter
Users interested in tpu-starter are comparing it to the libraries listed below.
- JAX implementation of the Llama 2 model (☆219, updated last year)
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax (☆617, updated this week)
- JAX Synergistic Memory Inspector (☆175, updated last year)
- Puzzles for exploring transformers (☆355, updated 2 years ago)
- ☆358, updated last year
- For optimization algorithm research and development. (☆521, updated this week)
- Annotated version of the Mamba paper (☆486, updated last year)
- Implementation of Flash Attention in Jax (☆213, updated last year)
- Inference code for LLaMA models in JAX (☆118, updated last year)
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… (☆388, updated last week)
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… (☆514, updated last week)
- ☆186, updated last month
- jax-triton contains integrations between JAX and OpenAI Triton (☆405, updated 3 weeks ago)
- Implementation of a Transformer, but completely in Triton (☆270, updated 3 years ago)
- ☆440, updated 9 months ago
- What would you do with 1000 H100s... (☆1,063, updated last year)
- ☆322, updated 3 weeks ago
- Language Modeling with the H3 State Space Model (☆520, updated last year)
- ☆274, updated last year
- Named tensors with first-class dimensions for PyTorch (☆332, updated 2 years ago)
- Accelerate and optimize performance with streamlined training and serving options in JAX. (☆289, updated this week)
- ☆166, updated 2 years ago
- ☆230, updated 5 months ago
- A Jax-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more. (☆291, updated 10 months ago)
- Home for "How To Scale Your Model", a short blog-style textbook about scaling LLMs on TPUs (☆430, updated last week)
- Train very large language models in Jax. (☆204, updated last year)
- ☆512, updated last year
- JAX-Toolbox (☆321, updated this week)
- ☆304, updated last year
- A subset of PyTorch's neural network modules, written in Python using OpenAI's Triton. (☆560, updated this week)