AI-Hypercomputer / cloud-accelerator-diagnostics
☆20 · Updated last week
Alternatives and similar repositories for cloud-accelerator-diagnostics
Users interested in cloud-accelerator-diagnostics are comparing it to the libraries listed below.
- ☆186 · Updated 2 weeks ago
- A simple library for scaling up JAX programs ☆134 · Updated 6 months ago
- ☆138 · Updated 2 weeks ago
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… ☆496 · Updated 2 weeks ago
- seqax = sequence modeling + JAX ☆155 · Updated last month
- jax-triton contains integrations between JAX and OpenAI Triton ☆393 · Updated 2 weeks ago
- ☆301 · Updated last week
- JAX implementation of the Llama 2 model ☆218 · Updated last year
- Implementation of Flash Attention in Jax ☆209 · Updated last year
- Two implementations of ZeRO-1 optimizer sharding in JAX ☆14 · Updated last year
- ☆109 · Updated this week
- JAX Synergistic Memory Inspector ☆173 · Updated 10 months ago
- ☆217 · Updated 10 months ago
- xpk (Accelerated Processing Kit, pronounced x-p-k) is a software tool to help Cloud developers orchestrate training jobs on accelerat… ☆120 · Updated last week
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆569 · Updated this week
- JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs wel… ☆325 · Updated this week
- JAX-Toolbox ☆302 · Updated this week
- ☆79 · Updated 10 months ago
- Orbax provides common checkpointing and persistence utilities for JAX users ☆378 · Updated last week
- JAX bindings for Flash Attention v2 ☆88 · Updated 10 months ago
- A set of Python scripts that make your experience on TPU better ☆53 · Updated 10 months ago
- ☆455 · Updated 10 months ago
- PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference ☆60 · Updated last month
- Home for "How To Scale Your Model", a short blog-style textbook about scaling LLMs on TPUs ☆269 · Updated 3 weeks ago
- Named Tensors for Legible Deep Learning in JAX ☆173 · Updated 2 weeks ago
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with Jax and Equinox. ☆24 · Updated 7 months ago
- LoRA for arbitrary JAX models and functions ☆136 · Updated last year
- Inference code for LLaMA models in JAX ☆118 · Updated 11 months ago
- PyTorch-centric eager mode debugger ☆47 · Updated 5 months ago
- Jax/Flax rewrite of Karpathy's nanoGPT ☆57 · Updated 2 years ago
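Several entries above implement Flash Attention. The core idea it relies on is the online softmax: keys and values are processed block by block while a running maximum, running normalizer, and unnormalized output are maintained, so the full attention matrix never has to be materialized. A minimal plain-NumPy sketch of that idea (function names here are illustrative, not taken from any listed repository):

```python
import numpy as np

def naive_attention(q, k, v):
    # Reference: standard scaled dot-product attention.
    d = q.shape[-1]
    s = q @ k.T / np.sqrt(d)
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def blockwise_attention(q, k, v, block=4):
    # Online-softmax attention in the Flash Attention style:
    # iterate over key/value blocks, keeping a running max (m),
    # running normalizer (l), and unnormalized output (o).
    n, d = q.shape
    m = np.full((n, 1), -np.inf)
    l = np.zeros((n, 1))
    o = np.zeros((n, d))
    for start in range(0, k.shape[0], block):
        kb = k[start:start + block]
        vb = v[start:start + block]
        s = q @ kb.T / np.sqrt(d)                       # scores for this block
        m_new = np.maximum(m, s.max(axis=-1, keepdims=True))
        p = np.exp(s - m_new)                           # unnormalized block probabilities
        scale = np.exp(m - m_new)                       # rescale earlier accumulators
        l = l * scale + p.sum(axis=-1, keepdims=True)
        o = o * scale + p @ vb
        m = m_new
    return o / l
```

Both functions produce the same result; the blockwise version only needs O(block) columns of the score matrix at a time, which is what makes the TPU/GPU kernel implementations above memory-efficient.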
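The "LoRA for arbitrary JAX models and functions" entry is based on the low-rank adaptation trick: a frozen weight matrix W is augmented with a trainable update A @ B of rank r, scaled by alpha/r. A hedged NumPy sketch of just that computation (names are illustrative, not the listed library's API):

```python
import numpy as np

def lora_apply(x, w, a, b, alpha=16.0):
    # y = x @ (W + (alpha/r) * A @ B), where W (d_in x d_out) is frozen
    # and only the low-rank factors A (d_in x r) and B (r x d_out) train.
    # Computing (x @ a) @ b avoids ever forming the dense update A @ B.
    r = a.shape[1]
    return x @ w + (alpha / r) * (x @ a) @ b
```

Initializing B to zeros (the usual convention) makes the adapted layer start out identical to the frozen base layer, so fine-tuning begins from the pretrained behavior.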