kvfrans / jax-diffusion-transformer
Implementation of Diffusion Transformer (DiT) in JAX
☆300 · Updated last year
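As a rough illustration of what a DiT implementation involves (this is a sketch in plain `jax.numpy`, not the repository's actual API), a DiT block conditions on the timestep/class embedding through adaLN-style shift/scale modulation of the normalized token activations:

```python
# Illustrative sketch of adaLN-style modulation, the conditioning mechanism
# at the heart of a DiT block. Names and shapes are assumptions for
# illustration, not the API of kvfrans/jax-diffusion-transformer.
import jax
import jax.numpy as jnp

def layer_norm(x, eps=1e-6):
    # Parameter-free LayerNorm over the feature axis; DiT replaces the
    # usual learned affine with conditioning-dependent shift/scale below.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / jnp.sqrt(var + eps)

def modulate(x, shift, scale):
    # The conditioning embedding (e.g. a timestep embedding) shifts and
    # scales every token's normalized activations.
    return x * (1.0 + scale[:, None, :]) + shift[:, None, :]

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (2, 16, 64))     # (batch, tokens, dim)
cond = jax.random.normal(key, (2, 2 * 64))  # conditioning embedding
shift, scale = jnp.split(cond, 2, axis=-1)  # two (batch, dim) halves
y = modulate(layer_norm(x), shift, scale)
print(y.shape)  # (2, 16, 64)
```

In the full model this modulation wraps both the attention and MLP sub-blocks, with the shift/scale (and a residual gate) produced by a small MLP over the conditioning embedding.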
Alternatives and similar repositories for jax-diffusion-transformer
Users interested in jax-diffusion-transformer are comparing it to the repositories listed below.
- ☆287 · Updated last year
- Minimal yet performant LLM examples in pure JAX ☆225 · Updated last week
- For optimization algorithm research and development. ☆556 · Updated 3 weeks ago
- UNet diffusion model in pure CUDA ☆659 · Updated last year
- Annotated version of the Mamba paper ☆493 · Updated last year
- 🧱 Modula software package ☆322 · Updated 4 months ago
- CIFAR-10 speedruns: 94% in 2.6 seconds and 96% in 27 seconds ☆340 · Updated last month
- A simple library for scaling up JAX programs ☆144 · Updated 2 months ago
- ☆234 · Updated last year
- Efficient optimizers ☆280 · Updated 2 weeks ago
- Dion optimizer algorithm ☆413 · Updated this week
- Quick implementation of nGPT, learning entirely on the hypersphere, from Nvidia AI ☆294 · Updated 7 months ago
- A zero-to-one guide on scaling modern transformers with n-dimensional parallelism. ☆112 · Updated last week
- ☆122 · Updated 6 months ago
- Supporting PyTorch FSDP for optimizers ☆84 · Updated last year
- Flow-matching algorithms in JAX ☆113 · Updated last year
- Accelerated First Order Parallel Associative Scan ☆192 · Updated this week
- Run PyTorch in JAX. 🤝 ☆309 · Updated 2 months ago
- seqax = sequence modeling + JAX ☆169 · Updated 5 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆181 · Updated 6 months ago
- Normalized Transformer (nGPT) ☆195 · Updated last year
- Universal Notation for Tensor Operations in Python. ☆456 · Updated 9 months ago
- Latent Program Network (from the "Searching Latent Program Spaces" paper) ☆107 · Updated last month
- JAX-Toolbox ☆373 · Updated this week
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆688 · Updated 2 weeks ago
- ☆91 · Updated last year
- Minimal (400 LOC) implementation, maximum (multi-node, FSDP) GPT training ☆132 · Updated last year
- Jax/Flax rewrite of Karpathy's nanoGPT ☆62 · Updated 2 years ago
- A Jax-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT and more. ☆297 · Updated last year
- FlexAttention-based, minimal vLLM-style inference engine for fast Gemma 2 inference. ☆329 · Updated 2 months ago