AI-Hypercomputer / cloud-tpu-monitoring-debugging
☆12 · Updated 5 months ago
Alternatives and similar repositories for cloud-tpu-monitoring-debugging
Users interested in cloud-tpu-monitoring-debugging are comparing it to the libraries listed below.
- ☆142 · Updated last week
- xpk (Accelerated Processing Kit, pronounced x-p-k) is a software tool to help Cloud developers to orchestrate training jobs on accelerat… ☆136 · Updated this week
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… ☆522 · Updated last week
- ☆39 · Updated 3 weeks ago
- Orbax provides common checkpointing and persistence utilities for JAX users ☆409 · Updated this week
- Jax/Flax rewrite of Karpathy's nanoGPT ☆59 · Updated 2 years ago
- JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs wel… ☆365 · Updated last month
- ☆187 · Updated last week
- ☆275 · Updated last year
- JAX-Toolbox ☆329 · Updated this week
- Accelerate, Optimize performance with streamlined training and serving options with JAX. ☆293 · Updated this week
- jax-triton contains integrations between JAX and OpenAI Triton ☆412 · Updated last month
- ☆519 · Updated last year
- Library for reading and processing ML training data. ☆487 · Updated last week
- ☆16 · Updated 4 months ago
- A simple library for scaling up JAX programs ☆140 · Updated 9 months ago
- ☆141 · Updated last week
- A Jax-based library for building transformers, includes implementations of GPT, Gemma, LlaMa, Mixtral, Whisper, SWin, ViT and more. ☆290 · Updated 11 months ago
- ☆238 · Updated last week
- ☆323 · Updated last week
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆630 · Updated this week
- JAX implementation of the Llama 2 model ☆219 · Updated last year
- CLU lets you write beautiful training loops in JAX. ☆351 · Updated last month
- Modular, scalable library to train ML models ☆143 · Updated this week
- ☆115 · Updated last week
- Train very large language models in Jax. ☆206 · Updated last year
- Implementation of Flash Attention in Jax ☆215 · Updated last year
- Scalable and Performant Data Loading ☆291 · Updated this week
- Fault tolerance for PyTorch (HSDP, LocalSGD, DiLoCo, Streaming DiLoCo) ☆377 · Updated this week
- seqax = sequence modeling + JAX ☆165 · Updated 2 weeks ago