openxla / tokamax
Tokamax: A GPU and TPU kernel library.
☆95 · Updated this week
Alternatives and similar repositories for tokamax
Users interested in tokamax are comparing it to the libraries listed below.
- Minimal yet performant LLM examples in pure JAX (☆187, updated last month)
- JMP is a Mixed Precision library for JAX. (☆209, updated 9 months ago)
- A simple library for scaling up JAX programs (☆144, updated last year)
- jax-triton contains integrations between JAX and OpenAI Triton (☆429, updated 2 weeks ago)
- Einsum-like high-level array sharding API for JAX (☆34, updated last year)
- If it quacks like a tensor... (☆59, updated 11 months ago)
- Minimal, lightweight JAX implementations of popular models. (☆117, updated this week)
- JAX-Toolbox (☆356, updated this week)
- Named Tensors for Legible Deep Learning in JAX (☆211, updated 2 weeks ago)
- A user-friendly toolchain that enables seamless execution of ONNX models using JAX as the backend. (☆123, updated last month)
- LoRA for arbitrary JAX models and functions (☆141, updated last year)
- Orbax provides common checkpointing and persistence utilities for JAX users (☆439, updated this week)
- OpTree: Optimized PyTree Utilities (☆195, updated last week)
- JAX Synergistic Memory Inspector (☆179, updated last year)
- Turn jitted JAX functions back into Python source code (☆22, updated 10 months ago)
- PyTorch-like dataloaders for JAX. (☆93, updated 5 months ago)
- Implementation of Flash Attention in JAX (☆219, updated last year)
- A functional training-loop library for JAX (☆88, updated last year)
- Run PyTorch in JAX. 🤝 (☆305, updated 3 weeks ago)
- JAX bindings for Flash Attention v2 (☆97, updated last week)
- A JAX quantization library (☆53, updated this week)
- Multiple dispatch over abstract array types in JAX. (☆134, updated 3 weeks ago)
- seqax = sequence modeling + JAX (☆168, updated 3 months ago)
- Tensor Parallelism with JAX + Shard Map (☆11, updated 2 years ago)
- Second Order Optimization and Curvature Estimation with K-FAC in JAX. (☆294, updated 2 weeks ago)
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with JAX and Equinox. (☆24, updated last year)