google/flaxformer
☆322 · Updated 5 months ago
Related projects:
- Implementation of Flash Attention in JAX (☆188, updated 6 months ago)
- Task-based datasets, preprocessing, and evaluation for sequence models (☆552, updated this week)
- Pax is a JAX-based machine learning framework for training large-scale models. Pax allows for advanced and fully configurable experimenta… (☆445, updated last week)
- Implementation of the specific Transformer architecture from PaLM ("Scaling Language Modeling with Pathways") in JAX, using the Equinox framework (☆184, updated 2 years ago)
- Implementation of a Transformer, but completely in Triton (☆242, updated 2 years ago)
- jax-triton contains integrations between JAX and OpenAI Triton (☆328, updated this week)
- Train very large language models in JAX (☆191, updated 10 months ago)
- Inference code for LLaMA models in JAX (☆108, updated 3 months ago)
- Named tensors with first-class dimensions for PyTorch (☆321, updated last year)
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and JAX (☆492, updated this week)
- JAX Synergistic Memory Inspector (☆161, updated 2 months ago)
- Swarm training framework using Haiku + JAX + Ray for layer-parallel transformer language models on unreliable, heterogeneous nodes (☆236, updated last year)
- For optimization algorithm research and development (☆240, updated last week)
- Sequence modeling with Mega (☆296, updated last year)
- JAX implementation of the Llama 2 model (☆205, updated 7 months ago)
- Code for the ALiBi method for transformer language models (ICLR 2022); a minimal JAX sketch of the bias appears after this list (☆497, updated 10 months ago)
- CLU lets you write beautiful training loops in JAX (☆318, updated 3 weeks ago)
- Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT (☆202, updated 3 weeks ago)
- Implementation of the conditionally routed attention in the CoLT5 architecture, in PyTorch (☆222, updated last week)
- JMP is a mixed-precision library for JAX; a rough sketch of the underlying mixed-precision pattern also follows this list (☆183, updated 3 months ago)
- A minimal PyTorch Lightning OpenAI GPT with DeepSpeed training (☆111, updated last year)
- Long Range Arena for Benchmarking Efficient Transformers (☆711, updated 9 months ago)
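For the ALiBi entry above, here is a minimal sketch of the linear attention bias in plain JAX, assuming a power-of-two head count; the function names are illustrative and not taken from the linked repository:

```python
import jax.numpy as jnp

def alibi_slopes(num_heads: int) -> jnp.ndarray:
    # One slope per head: a geometric sequence 2^(-8/n), 2^(-16/n), ...,
    # following the ALiBi paper's recipe when num_heads is a power of two.
    start = 2.0 ** (-8.0 / num_heads)
    return start ** jnp.arange(1, num_heads + 1)

def alibi_bias(num_heads: int, seq_len: int) -> jnp.ndarray:
    # Bias of shape [num_heads, seq_len, seq_len] added to attention logits:
    # each key is penalised in proportion to its distance from the query,
    # scaled by the head-specific slope. Only the causal (lower) triangle
    # matters; the rest is removed by the usual causal mask.
    slopes = alibi_slopes(num_heads)                  # [H]
    pos = jnp.arange(seq_len)
    relative = pos[None, :] - pos[:, None]            # key index minus query index
    return slopes[:, None, None] * relative[None, :, :]
```

In ALiBi the model uses no positional embeddings; this bias is simply added to the query-key scores before the softmax.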
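Similarly, for the JMP entry: JMP packages mixed-precision policies and loss scaling for JAX. The sketch below shows only the underlying pattern (float32 master parameters, bfloat16 compute) in plain jax.numpy; it does not use JMP's own API, and all names are illustrative.

```python
import jax
import jax.numpy as jnp

def forward(params, x):
    # Toy linear model standing in for an arbitrary network.
    return x @ params["w"] + params["b"]

def mixed_precision_loss(params_f32, x, y):
    # Keep float32 "master" parameters, cast params and inputs to bfloat16
    # for the forward pass, and cast the output back to float32 for the loss.
    to_bf16 = lambda t: jax.tree_util.tree_map(lambda a: a.astype(jnp.bfloat16), t)
    preds = forward(to_bf16(params_f32), x.astype(jnp.bfloat16))
    return jnp.mean((preds.astype(jnp.float32) - y) ** 2)

params = {"w": jnp.zeros((4, 2), jnp.float32), "b": jnp.zeros((2,), jnp.float32)}
x, y = jnp.ones((8, 4)), jnp.ones((8, 2))
grads = jax.grad(mixed_precision_loss)(params, x, y)  # gradients arrive in float32
```

With float16 rather than bfloat16 compute, a loss-scaling step is typically added as well, which is the other half of what the library provides.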