graphcore-research / jax-experimental
JAX for Graphcore IPU (experimental)
☆21 · Updated 8 months ago
Related projects
Alternatives and complementary repositories for jax-experimental
- ☆48 · Updated 3 months ago
- Experiment of using Tangent to autodiff triton · ☆72 · Updated 9 months ago
- Tensor Parallelism with JAX + Shard Map (see the shard_map sketch after this list) · ☆11 · Updated last year
- Einsum-like high-level array sharding API for JAX · ☆32 · Updated 4 months ago
- Turn jitted jax functions back into python source code · ☆20 · Updated 4 months ago
- If it quacks like a tensor... · ☆52 · Updated last week
- A simple library for scaling up JAX programs · ☆127 · Updated 2 weeks ago
- ☆36 · Updated 10 months ago
- ☆99 · Updated 4 months ago
- Named Tensors for Legible Deep Learning in JAX · ☆153 · Updated this week
- JMP is a Mixed Precision library for JAX (see the mixed-precision sketch after this list) · ☆187 · Updated 6 months ago
- ☆14 · Updated last month
- seqax = sequence modeling + JAX · ☆133 · Updated 4 months ago
- A library for unit scaling in PyTorch · ☆105 · Updated 2 weeks ago
- JAX-Toolbox · ☆248 · Updated this week
- Parallel Associative Scan for Language Models · ☆18 · Updated 10 months ago
- Code implementing "Efficient Parallelization of a Ubiquitous Sequential Computation" (Heinsen, 2023); see the associative-scan sketch after this list · ☆87 · Updated 10 months ago
- Automatic Differentiation for high-performance stencil loops · ☆12 · Updated 3 years ago
- extensible collectives library in triton · ☆72 · Updated last month
- This is a port of the Mistral-7B model in JAX · ☆30 · Updated 4 months ago
- ☆23 · Updated 2 months ago
- Jax Decompiler · ☆13 · Updated 3 months ago
- Personal solutions to the Triton Puzzles · ☆16 · Updated 4 months ago
- Stencil computations in JAX · ☆66 · Updated last year
- Sparsity support for PyTorch · ☆31 · Updated this week
- jax-triton contains integrations between JAX and OpenAI Triton (see the triton_call sketch after this list) · ☆343 · Updated 3 weeks ago
- ☆16 · Updated 2 months ago
- Demo of the unit_scaling library, showing how a model can be easily adapted to train in FP8. · ☆35 · Updated 4 months ago
- FlexAttention w/ FlashAttention3 Support · ☆27 · Updated last month
- ☆17 · Updated 3 weeks ago
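
The "Tensor Parallelism with JAX + Shard Map" entry above refers to `jax.experimental.shard_map`. Below is a minimal, generic sketch of the idea (not that repository's code): a matmul whose contraction dimension is split across devices, with partial products reduced via `psum`. The mesh axis name and sizes are placeholders.

```python
# Minimal tensor-parallel matmul sketch with shard_map (generic, not the
# listed repository's code). Import path may vary with your JAX version.
import jax
import jax.numpy as jnp
import numpy as np
from jax.experimental.shard_map import shard_map
from jax.sharding import Mesh, PartitionSpec as P

devices = np.array(jax.devices())
mesh = Mesh(devices, axis_names=("model",))

def matmul_block(x, w):
    # Each device holds a slice of the contraction dimension; partial
    # products are summed across the "model" mesh axis.
    return jax.lax.psum(x @ w, axis_name="model")

tp_matmul = shard_map(
    matmul_block,
    mesh=mesh,
    in_specs=(P(None, "model"), P("model", None)),
    out_specs=P(None, None),
)

x = jnp.ones((8, 4 * len(devices)))
w = jnp.ones((4 * len(devices), 16))
y = jax.jit(tp_matmul)(x, w)  # same result as x @ w, computed tensor-parallel
```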
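The JMP entry above is DeepMind's mixed-precision library for JAX. The sketch below shows its policy-based casting as documented upstream; the policy-string format and method names should be checked against the installed version.

```python
# Hedged sketch of mixed-precision casting with JMP (deepmind/jmp).
import jax.numpy as jnp
import jmp

# Keep parameters and outputs in float32, run compute in half precision.
policy = jmp.get_policy("params=float32,compute=float16,output=float32")

def forward(params, x):
    # Cast the whole pytree of inputs to the compute dtype.
    params, x = policy.cast_to_compute((params, x))
    y = x @ params["w"] + params["b"]
    return policy.cast_to_output(y)

params = {"w": jnp.ones((4, 4)), "b": jnp.zeros((4,))}
x = jnp.ones((2, 4))
print(forward(params, x).dtype)  # float32 output, float16 compute inside
```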
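The two scan-related entries above both target the first-order linear recurrence x_t = a_t * x_{t-1} + b_t. The generic sketch below uses `jax.lax.associative_scan` with the standard affine-composition combine rule; it illustrates the technique, not either repository's exact code.

```python
# Parallel evaluation of x_t = a_t * x_{t-1} + b_t (zero initial state)
# via an associative scan; a generic sketch of the technique.
import jax
import jax.numpy as jnp

def combine(left, right):
    a_l, b_l = left
    a_r, b_r = right
    # Composing two affine steps x -> a*x + b is itself affine.
    return a_r * a_l, a_r * b_l + b_r

def linear_recurrence(a, b):
    _, x = jax.lax.associative_scan(combine, (a, b))
    return x

a = jnp.array([0.5, 0.9, 1.1, 0.7])
b = jnp.array([1.0, -2.0, 0.3, 0.0])
print(linear_recurrence(a, b))
# Matches the sequential loop: x = 0; for a_t, b_t in zip(a, b): x = a_t * x + b_t
```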
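The jax-triton entry above exposes Triton kernels to JAX through `jax_triton.triton_call`. The add-vectors sketch below follows the pattern from that project's README from memory, so the exact keyword arguments may differ across versions; it requires a GPU with Triton installed.

```python
# Calling a Triton kernel from JAX via jax-triton (GPU only); adapted from
# memory of the project's README, so verify argument names for your version.
import jax
import jax.numpy as jnp
import jax_triton as jt
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    tl.store(out_ptr + offsets, tl.load(x_ptr + offsets) + tl.load(y_ptr + offsets))

def add(x, y, block_size=8):
    return jt.triton_call(
        x, y,
        kernel=add_kernel,
        out_shape=jax.ShapeDtypeStruct(x.shape, x.dtype),
        grid=(x.size // block_size,),
        BLOCK_SIZE=block_size,
    )

x = jnp.arange(64, dtype=jnp.float32)
print(jax.jit(add)(x, x))
```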