yixiaoer / tpux
A set of Python scripts that make your experience on TPU better
☆37 · Updated 2 months ago
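The tpux entry points themselves aren't shown on this page, so as a hedged illustration of the kind of sanity check such TPU tooling typically automates, here is a plain JAX snippet (assuming JAX is installed with TPU support; this is not the tpux API):

```python
# Minimal sketch: verify the JAX runtime can see the local accelerators.
# NOT the tpux API -- just the standard JAX device query it presumably wraps.
import jax

devices = jax.devices()  # raises RuntimeError if no backend is available
print(f"{len(devices)} device(s) visible:")
for d in devices:
    # On a TPU VM this prints entries like tpu:0, tpu:1, ...
    print(f"  {d.platform}:{d.id}")
```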
Related projects:
- ☆11 · Updated 2 months ago
- seqax = sequence modeling + JAX ☆129 · Updated 2 months ago
- JAX implementation of the Mistral 7B v0.2 model ☆32 · Updated 2 months ago
- ☆68 · Updated 2 months ago
- Minimal (400 LOC) implementation of maximum (multi-node, FSDP) GPT training ☆110 · Updated 5 months ago
- ☆20 · Updated last year
- Scalable neural net training via automatic normalization in the modular norm ☆108 · Updated 3 weeks ago
- A simple library for scaling up JAX programs ☆116 · Updated last month
- Experiments toward training a new and improved T5 ☆76 · Updated 5 months ago
- ☆27 · Updated this week
- A fast implementation of T5/UL2 in PyTorch using Flash Attention ☆60 · Updated 2 months ago
- Large-scale 4D-parallelism pre-training for 🤗 transformers with Mixture of Experts *(still a work in progress)* ☆77 · Updated 9 months ago
- JAX implementation of the Llama 2 model ☆205 · Updated 7 months ago
- Inference code for LLaMA models in JAX ☆108 · Updated 3 months ago
- LoRA for arbitrary JAX models and functions ☆127 · Updated 6 months ago
- Machine Learning eXperiment Utilities ☆42 · Updated 3 months ago
- ☆53 · Updated 8 months ago
- ☆48 · Updated 3 months ago
- ☆56 · Updated 2 years ago
- JAX bindings for Flash Attention v2 ☆75 · Updated 2 months ago
- Cold Compress is a hackable, lightweight, and open-source toolkit for creating and benchmarking cache compression methods built on top of… ☆73 · Updated last month
- If it quacks like a tensor... ☆48 · Updated 7 months ago
- A MAD laboratory to improve AI architecture designs 🧪 ☆84 · Updated 4 months ago
- Some common Hugging Face transformers in maximal update parametrization (µP) ☆76 · Updated 2 years ago
- Minimal but scalable implementation of large language models in JAX ☆17 · Updated 3 weeks ago
- Understand and test language model architectures on synthetic tasks ☆156 · Updated 4 months ago
- Simple and efficient PyTorch-native transformer training and inference (batched) ☆53 · Updated 5 months ago
- Transformer with Mu-Parameterization, implemented in JAX/Flax. Supports FSDP on TPU pods ☆29 · Updated 3 weeks ago
- ☆54 · Updated last week
- Scaling is a distributed training library and installable dependency designed to scale up neural networks, with a dedicated module for tr… ☆38 · Updated 2 weeks ago