google / qwix
A JAX quantization library
☆71 · Updated this week
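For context on what a quantization library for JAX does, here is a minimal sketch of symmetric int8 fake-quantization written in plain JAX. This is purely illustrative and is not qwix's actual API; the function name `fake_quantize_int8` and the per-tensor scaling scheme are assumptions for the example, and qwix provides its own interfaces.

```python
# Illustrative sketch only: NOT qwix's API. Shows the quantize-dequantize
# round trip that quantization libraries simulate during training/inference.
import jax.numpy as jnp

def fake_quantize_int8(x: jnp.ndarray) -> jnp.ndarray:
    """Symmetric per-tensor int8 quantize-dequantize round trip."""
    scale = jnp.max(jnp.abs(x)) / 127.0            # per-tensor symmetric scale
    q = jnp.clip(jnp.round(x / scale), -127, 127)  # snap to the int8 grid
    return q * scale                               # dequantize back to float

x = jnp.array([0.1, -2.5, 3.0, 0.0])
xq = fake_quantize_int8(x)
```

Each value in `xq` lies on the int8 grid spaced by `scale`, so the round-trip error per element is at most half a grid step.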
Alternatives and similar repositories for qwix
Users interested in qwix are comparing it to the libraries listed below.
- ☆283 · Updated last week
- torchax is a PyTorch frontend for JAX. It lets you author JAX programs using familiar PyTorch syntax. It also provides JA… ☆144 · Updated this week
- ☆68 · Updated last year
- ☆148 · Updated last month
- Minimal yet performant LLM examples in pure JAX ☆204 · Updated 2 months ago
- A user-friendly toolchain that enables seamless execution of ONNX models using JAX as the backend. ☆125 · Updated 2 months ago
- ☆190 · Updated 2 weeks ago
- JAX-Toolbox ☆364 · Updated this week
- A simple library for scaling up JAX programs ☆144 · Updated last month
- ☆119 · Updated 5 months ago
- JAX implementation of Black Forest Labs' Flux.1 family of models ☆39 · Updated 2 weeks ago
- An implementation of the PSGD Kron second-order optimizer for PyTorch ☆97 · Updated 4 months ago
- LoRA for arbitrary JAX models and functions ☆143 · Updated last year
- FlashRNN - Fast RNN Kernels with I/O Awareness ☆169 · Updated last month
- Modular, scalable library to train ML models ☆176 · Updated this week
- PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference ☆78 · Updated 2 months ago
- A FlashAttention implementation for JAX with support for efficient document mask computation and context parallelism. ☆149 · Updated 3 weeks ago
- ☆35 · Updated last year
- 📄 Small Batch Size Training for Language Models ☆68 · Updated 2 months ago
- seqax = sequence modeling + JAX ☆168 · Updated 4 months ago
- Supporting PyTorch FSDP for optimizers ☆84 · Updated last year
- ☆285 · Updated last year
- ☆118 · Updated last month
- Train a SmolLM-style LLM on fineweb-edu in JAX/Flax with an assortment of optimizers. ☆18 · Updated 4 months ago
- Accelerated First Order Parallel Associative Scan ☆192 · Updated last year
- Implementation of Flash Attention in JAX ☆223 · Updated last year
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆174 · Updated 5 months ago
- Orbax provides common checkpointing and persistence utilities for JAX users. ☆458 · Updated this week
- Tiled Flash Linear Attention library for fast and efficient mLSTM kernels. ☆77 · Updated last week
- Transformer with Mu-Parameterization, implemented in JAX/Flax. Supports FSDP on TPU pods. ☆32 · Updated 6 months ago