ayaka14732 / tpu-starter
Everything you want to know about Google Cloud TPU
☆538 · Updated last year
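Since tpu-starter is centred on running JAX on Cloud TPU VMs, here is a minimal sketch (not taken from the repo, and assuming a TPU VM with `jax[tpu]` installed) of confirming that JAX sees the TPU cores and running a jitted op:

```python
# Minimal sketch (not from tpu-starter): confirm JAX sees the TPU cores on a
# Cloud TPU VM, then run a small jitted matmul on the default device.
import jax
import jax.numpy as jnp

print("backend:", jax.default_backend())   # "tpu" on a correctly set-up TPU VM
print("devices:", jax.device_count())      # e.g. 8 cores on a v3-8 / v4-8 host

@jax.jit
def matmul(a, b):
    return a @ b

a = jnp.ones((1024, 1024), dtype=jnp.bfloat16)
b = jnp.ones((1024, 1024), dtype=jnp.bfloat16)
out = matmul(a, b).block_until_ready()     # executes on the TPU
print(out.shape, out.dtype)
```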
Alternatives and similar repositories for tpu-starter
Users interested in tpu-starter are comparing it to the libraries listed below.
- JAX implementation of the Llama 2 model ☆219 · Updated last year
- JAX Synergistic Memory Inspector ☆177 · Updated last year
- Puzzles for exploring transformers ☆355 · Updated 2 years ago
- Annotated version of the Mamba paper ☆487 · Updated last year
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… ☆389 · Updated this week
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆630 · Updated this week
- ☆275 · Updated last year
- Implementation of Flash Attention in Jax ☆215 · Updated last year (a plain-JAX reference attention sketch follows this list)
- ☆361 · Updated last year
- Inference code for LLaMA models in JAX ☆118 · Updated last year
- Accelerate and optimize performance with streamlined training and serving options in JAX. ☆296 · Updated this week
- ☆442 · Updated 9 months ago
- ☆187 · Updated last week
- jax-triton contains integrations between JAX and OpenAI Triton ☆412 · Updated last month
- Implementation of a Transformer, but completely in Triton ☆273 · Updated 3 years ago
- ☆519 · Updated last year
- Pax is a Jax-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimenta… ☆523 · Updated last week
- A subset of PyTorch's neural network modules, written in Python using OpenAI's Triton. ☆567 · Updated this week
- For optimization algorithm research and development. ☆524 · Updated this week
- Home for "How To Scale Your Model", a short blog-style textbook about scaling LLMs on TPUs ☆466 · Updated this week
- What would you do with 1000 H100s... ☆1,080 · Updated last year
- Named tensors with first-class dimensions for PyTorch ☆332 · Updated 2 years ago
- ☆307 · Updated last year
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1 GPU + 1 Day ☆256 · Updated last year
- ☆323 · Updated last week
- Language Modeling with the H3 State Space Model ☆519 · Updated last year
- A JAX-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more. ☆290 · Updated 11 months ago
- ☆232 · Updated 5 months ago
- Efficient optimizers ☆253 · Updated last week
- LoRA for arbitrary JAX models and functions ☆140 · Updated last year
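For context on the "Implementation of Flash Attention in Jax" entry above, here is a plain-JAX reference scaled-dot-product attention: a sketch of the baseline computation such a library fuses and accelerates. Names and shapes are illustrative, not that library's API.

```python
# Reference scaled-dot-product attention in plain JAX (illustrative only).
# A fused Flash Attention kernel computes the same result without
# materialising the full [seq, seq] score matrix.
import jax
import jax.numpy as jnp

def reference_attention(q, k, v):
    # q, k, v: [batch, heads, seq_len, head_dim]
    scale = 1.0 / jnp.sqrt(q.shape[-1])
    scores = jnp.einsum("bhqd,bhkd->bhqk", q, k) * scale   # [b, h, q, k]
    weights = jax.nn.softmax(scores, axis=-1)
    return jnp.einsum("bhqk,bhkd->bhqd", weights, v)

key = jax.random.PRNGKey(0)
q = k = v = jax.random.normal(key, (1, 4, 128, 64))
out = jax.jit(reference_attention)(q, k, v)
print(out.shape)  # (1, 4, 128, 64)
```

A fused kernel produces the same output while avoiding the quadratic attention matrix in memory, which is what makes it practical at long sequence lengths.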