jax-ml / jax-tpu-embedding
☆25 · Updated this week
Alternatives and similar repositories for jax-tpu-embedding
Users interested in jax-tpu-embedding are comparing it to the libraries listed below.
- A simple library for scaling up JAX programs ☆144 · Updated 3 weeks ago
- Tokamax: A GPU and TPU kernel library. ☆116 · Updated this week
- ☆190 · Updated last week
- A user-friendly tool chain that enables the seamless execution of ONNX models using JAX as the backend. ☆125 · Updated 2 months ago
- ☆118 · Updated 3 weeks ago
- torchax is a PyTorch frontend for JAX. It gives JAX users the ability to author JAX programs using familiar PyTorch syntax. It also provides JA… ☆134 · Updated this week
- Experiment of using Tangent to autodiff triton ☆80 · Updated last year
- Minimal yet performant LLM examples in pure JAX ☆202 · Updated 2 months ago
- JMP is a Mixed Precision library for JAX. ☆211 · Updated 10 months ago
- jax-triton contains integrations between JAX and OpenAI Triton ☆436 · Updated this week
- ☆54 · Updated last year
- ☆147 · Updated 3 weeks ago
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year
- seqax = sequence modeling + JAX ☆168 · Updated 4 months ago
- JAX Synergistic Memory Inspector ☆182 · Updated last year
- ☆337 · Updated last week
- JAX-Toolbox ☆364 · Updated this week
- Write a fast kernel and run it on Discord. See how you compare against the best! ☆61 · Updated last week
- Machine Learning eXperiment Utilities ☆46 · Updated 4 months ago
- Implementation of Flash Attention in Jax ☆222 · Updated last year
- JAX bindings for Flash Attention v2 ☆99 · Updated 3 weeks ago
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆149 · Updated 2 weeks ago
- A stand-alone implementation of several NumPy dtype extensions used in machine learning. ☆312 · Updated this week
- Named Tensors for Legible Deep Learning in JAX ☆212 · Updated 3 weeks ago
- A performant, memory-efficient checkpointing library for PyTorch applications, designed with large, complex distributed workloads in mind… ☆161 · Updated 2 months ago
- Train very large language models in Jax. ☆210 · Updated 2 years ago
- Demo of the unit_scaling library, showing how a model can be easily adapted to train in FP8. ☆46 · Updated last year
- JAX implementation of the Mistral 7b v0.2 model ☆35 · Updated last year
- ☆62 · Updated 3 years ago
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with Jax and Equinox. ☆24 · Updated last year