davisyoshida / qax
If it quacks like a tensor...
☆59 · Updated last year
Alternatives and similar repositories for qax
Users interested in qax are comparing it to the libraries listed below.
- LoRA for arbitrary JAX models and functions ☆144 · Updated last year
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- JAX Synergistic Memory Inspector ☆184 · Updated last year
- Einsum-like high-level array sharding API for JAX ☆34 · Updated last year
- JMP is a Mixed Precision library for JAX. ☆211 · Updated last year
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated 2 months ago
- JAX implementation of the Mistral 7b v0.2 model ☆35 · Updated last year
- Inference code for LLaMA models in JAX ☆120 · Updated last year
- Named Tensors for Legible Deep Learning in JAX ☆218 · Updated 2 months ago
- Minimal yet performant LLM examples in pure JAX ☆236 · Updated 2 weeks ago
- Pytorch-like dataloaders for JAX. ☆98 · Updated last month
- Machine Learning eXperiment Utilities ☆48 · Updated 6 months ago
- seqax = sequence modeling + JAX ☆170 · Updated 6 months ago
- Jax/Flax rewrite of Karpathy's nanoGPT ☆63 · Updated 2 years ago
- A functional training loops library for JAX ☆88 · Updated last year
- Implementation of PSGD optimizer in JAX ☆35 · Updated last year
- JAX Arrays for human consumption ☆110 · Updated 3 months ago
- Tokamax: A GPU and TPU kernel library. ☆165 · Updated last week
- Train very large language models in Jax. ☆210 · Updated 2 years ago
- TPU pod commander is a package for managing and launching jobs on Google Cloud TPU pods. ☆21 · Updated 4 months ago
- A set of Python scripts that makes your experience on TPU better ☆56 · Updated 4 months ago
- ☆123 · Updated 7 months ago
- Distributed pretraining of large language models (LLMs) on cloud TPU slices, with Jax and Equinox. ☆24 · Updated last year
- Unofficial but Efficient Implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX ☆92 · Updated 2 years ago
- Automatically take good care of your preemptible TPUs ☆37 · Updated 2 years ago
- A flexible and efficient implementation of Flash Attention 2.0 for JAX, supporting multiple backends (GPU/TPU/CPU) and platforms (Triton/… ☆34 · Updated 10 months ago
- JAX implementation of the Llama 2 model ☆216 · Updated 2 years ago
- A library for unit scaling in PyTorch ☆133 · Updated 6 months ago
- Run PyTorch in JAX. 🤝 ☆311 · Updated 3 months ago
- ☆18 · Updated last year