divyamakkar0 / JAXformer
A zero-to-one guide on scaling modern transformers with n-dimensional parallelism.
☆94 · Updated last week
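For context on the pitch above: "n-dimensional parallelism" in JAX is typically expressed as a multi-axis device mesh with named shardings. The sketch below is not taken from JAXformer itself; the mesh layout, axis names, and array shapes are illustrative assumptions showing a matmul sharded across "data" and "model" axes.

```python
# Minimal sketch of multi-axis ("n-dimensional") parallelism in JAX.
# Not from JAXformer: mesh layout, axis names, and shapes are assumptions.
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

n = jax.device_count()
# Arrange all available devices into a 2-D (data, model) grid; a real
# setup would pick a non-trivial model-axis size on multi-device hardware.
mesh = Mesh(np.array(jax.devices()).reshape(n, 1), axis_names=("data", "model"))

# Shard activations along "data" and weight columns along "model".
x = jax.device_put(jnp.ones((2 * n, 512)), NamedSharding(mesh, P("data", None)))
w = jax.device_put(jnp.ones((512, 1024)), NamedSharding(mesh, P(None, "model")))

y = jnp.dot(x, w)   # XLA inserts any needed collectives automatically
print(y.sharding)   # output is sharded over both mesh axes
```

Adding further parallelism dimensions (e.g. pipeline or expert axes) amounts to adding more named axes to the same mesh.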
Alternatives and similar repositories for JAXformer
Users interested in JAXformer are comparing it to the repositories listed below.
- A repository to unravel the language of GPUs, making their kernel conversations easy to understand ☆193 · Updated 4 months ago
- Simple Transformer in Jax ☆139 · Updated last year
- ☆221 · Updated 7 months ago
- Dion optimizer algorithm ☆360 · Updated this week
- Minimal yet performant LLM examples in pure JAX ☆177 · Updated last week
- NanoGPT-speedrunning for the poor T4 enjoyers ☆72 · Updated 5 months ago
- ☆281 · Updated last year
- ☆103 · Updated 2 weeks ago
- SIMD quantization kernels ☆87 · Updated 3 weeks ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆164 · Updated 3 months ago
- seqax = sequence modeling + JAX ☆167 · Updated 2 months ago
- 🧱 Modula software package ☆277 · Updated last month
- Small-scale distributed training of sequential deep learning models, built on NumPy and MPI. ☆142 · Updated last year
- Quantized LLM training in pure CUDA/C++. ☆32 · Updated this week
- PTX-Tutorial written purely by AIs (OpenAI's Deep Research and Claude 3.7) ☆66 · Updated 6 months ago
- An extension of the nanoGPT repository for training small MoE models. ☆195 · Updated 6 months ago
- ☆89 · Updated last year
- FlexAttention-based, minimal vLLM-style inference engine for fast Gemma 2 inference. ☆280 · Updated last month
- ☆173 · Updated last year
- ☆28 · Updated last year
- An open-source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆105 · Updated 6 months ago
- Implementation of Diffusion Transformer (DiT) in JAX