packquickly / schedule_free_optx
Schedule-free optimiser implemented in JAX using Optimistix.
☆14 · Updated 3 months ago
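Schedule-free optimization (Defazio et al., 2024) removes the need for a learning-rate schedule by interpolating between the raw gradient-descent iterates and their running average. The sketch below is not this repository's Optimistix-based API; it is a minimal pure-JAX illustration of the underlying update rule, with the names `schedule_free_sgd`, `lr`, and `beta` chosen here for illustration:

```python
import jax
import jax.numpy as jnp

def schedule_free_sgd(lr=0.1, beta=0.9):
    """Minimal schedule-free SGD sketch (after Defazio et al., 2024).

    Keeps two iterate sequences: `z` (raw SGD iterates) and `x`
    (their running average). Gradients are evaluated at the
    interpolation y = (1 - beta) * z + beta * x, which is what
    removes the need for a learning-rate schedule.
    """
    def init(params):
        return dict(z=params, x=params, t=0)

    def step(grad_fn, state):
        t = state["t"] + 1
        # Evaluate the gradient at the interpolated point y.
        y = jax.tree_util.tree_map(
            lambda z, x: (1 - beta) * z + beta * x, state["z"], state["x"]
        )
        grads = grad_fn(y)
        # Plain SGD step on the z sequence (no schedule).
        z = jax.tree_util.tree_map(lambda zi, g: zi - lr * g, state["z"], grads)
        # Online (uniform) average of the z iterates, weight 1/t.
        c = 1.0 / t
        x = jax.tree_util.tree_map(
            lambda xi, zi: (1 - c) * xi + c * zi, state["x"], z
        )
        return dict(z=z, x=x, t=t)

    return init, step

# Usage sketch: minimise a simple quadratic; `state["x"]` holds the
# averaged iterate that schedule-free methods return as the solution.
init, step = schedule_free_sgd(lr=0.1, beta=0.9)
grad_fn = jax.grad(lambda w: jnp.sum(w ** 2))
state = init(jnp.array([1.0, -2.0]))
for _ in range(300):
    state = step(grad_fn, state)
```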
Related projects:
- Scalable neural net training via automatic normalization in the modular norm. ☆108 · Updated last month
- Einsum-like high-level array sharding API for JAX. ☆31 · Updated 2 months ago
- PyTorch-like dataloaders in JAX. ☆52 · Updated last month
- Transformer with Mu-Parameterization, implemented in JAX/Flax; supports FSDP on TPU pods. ☆29 · Updated 3 weeks ago
- If it quacks like a tensor... ☆48 · Updated 7 months ago
- LoRA for arbitrary JAX models and functions. ☆127 · Updated 6 months ago
- Unofficial but efficient implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX. ☆74 · Updated 7 months ago
- seqax = sequence modeling + JAX. ☆129 · Updated 2 months ago
- A MAD laboratory to improve AI architecture designs 🧪. ☆84 · Updated 4 months ago
- Named tensors for legible deep learning in JAX. ☆147 · Updated this week
- JAX implementation of the Mistral 7B v0.2 model. ☆32 · Updated 2 months ago
- Experiment using Tangent to autodiff Triton. ☆66 · Updated 7 months ago
- Run PyTorch in JAX. 🤝 ☆187 · Updated 10 months ago
- A simple library for scaling up JAX programs. ☆116 · Updated last month
- A state-space model with rational transfer function representation. ☆61 · Updated 4 months ago
- A set of Python scripts that makes your experience on TPU better. ☆37 · Updated 2 months ago
- PyTorch half-precision GEMM library with fused optional bias and optional ReLU/GELU. ☆25 · Updated 3 weeks ago
- Personal solutions to the Triton Puzzles. ☆11 · Updated 2 months ago
- Multiple dispatch over abstract array types in JAX. ☆100 · Updated last month