google / saxml
☆136 · Updated last week
Alternatives and similar repositories for saxml:
Users interested in saxml are comparing it to the libraries listed below.
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… ☆482 · Updated last month
- JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs wel… ☆290 · Updated this week
- PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference ☆53 · Updated last month
- xpk (Accelerated Processing Kit, pronounced x-p-k) is a software tool to help Cloud developers orchestrate training jobs on accelerat… ☆106 · Updated this week
- JAX-Toolbox ☆286 · Updated this week
- Inference code for LLaMA models in JAX ☆116 · Updated 9 months ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆382 · Updated this week
- Testing framework for Deep Learning models (TensorFlow and PyTorch) on Google Cloud hardware accelerators (TPU and GPU) ☆64 · Updated 3 months ago
- Implementation of Flash Attention in Jax ☆206 · Updated last year
- seqax = sequence modeling + JAX ☆147 · Updated last week
- A user-friendly tool chain that enables the seamless execution of ONNX models using JAX as the backend. ☆108 · Updated 2 weeks ago
- Orbax provides common checkpointing and persistence utilities for JAX users ☆344 · Updated this week
- Implementation of a Transformer, but completely in Triton ☆259 · Updated 2 years ago
- This repository contains the experimental PyTorch native float8 training UX ☆221 · Updated 7 months ago
- PyTorch RFCs (experimental) ☆130 · Updated 6 months ago
- Train very large language models in Jax. ☆203 · Updated last year
- JAX implementation of the Llama 2 model ☆216 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆554 · Updated this week
- A stand-alone implementation of several NumPy dtype extensions used in machine learning. ☆254 · Updated this week
- JAX Synergistic Memory Inspector ☆168 · Updated 7 months ago
- Google TPU optimizations for transformers models ☆102 · Updated last month
- 🚀 Efficiently (pre)training foundation models with native PyTorch features, including FSDP for training and SDPA implementation of Flash… ☆226 · Updated this week
- A simple library for scaling up JAX programs ☆133 · Updated 4 months ago