ml-gde / jaxgarden
A collection of reusable, high-performance, well-documented, thoroughly tested layers and models in JAX
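To illustrate the kind of reusable layer such a collection targets, here is a minimal, self-contained sketch of a dense layer in plain JAX, written as an init/apply pair. This is a generic example, not jaxgarden's actual API; the function names `dense_init` and `dense_apply` are illustrative only.

```python
import jax
import jax.numpy as jnp

def dense_init(key, in_dim, out_dim):
    """Initialize dense-layer parameters with Glorot-style scaling."""
    scale = jnp.sqrt(2.0 / (in_dim + out_dim))
    return {
        "w": scale * jax.random.normal(key, (in_dim, out_dim)),
        "b": jnp.zeros(out_dim),
    }

def dense_apply(params, x):
    """Forward pass: x @ w + b."""
    return x @ params["w"] + params["b"]

params = dense_init(jax.random.PRNGKey(0), 8, 4)
y = dense_apply(params, jnp.ones((2, 8)))
print(y.shape)  # (2, 4)
```

Keeping parameters in a plain pytree like this is what lets JAX transformations (`jit`, `grad`, `vmap`) compose over the layer without any framework machinery.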
☆20 · Updated 2 months ago
Alternatives and similar repositories for jaxgarden
Users interested in jaxgarden are comparing it to the libraries listed below:
- Slide decks, coding exercises, and quick references for learning the JAX AI Stack ☆33 · Updated last week
- JAX Synergistic Memory Inspector ☆178 · Updated last year
- Implementation of Flash Attention in JAX ☆216 · Updated last year
- JAX implementation of the Llama 2 model ☆219 · Updated last year
- A JAX-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more. ☆290 · Updated 11 months ago
- 🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. ☆17 · Updated 2 months ago
- ☆188 · Updated 3 weeks ago
- EasyDel Former: a utility library designed to simplify and enhance development in JAX ☆28 · Updated last week
- ☆361 · Updated last year
- Accelerate and optimize performance with streamlined training and serving options in JAX. ☆301 · Updated this week
- Large-scale 4D-parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)* ☆87 · Updated last year
- Implementation of the specific Transformer architecture from PaLM (Scaling Language Modeling with Pathways) in JAX (Equinox framework) ☆187 · Updated 3 years ago
- A set of Python scripts that make your experience on TPUs better ☆54 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and JAX ☆643 · Updated this week
- Various transformers for FSDP research ☆38 · Updated 2 years ago
- ☆67 · Updated 3 years ago
- Inference code for LLaMA models in JAX ☆118 · Updated last year
- Train very large language models in JAX. ☆206 · Updated last year
- A JAX-native LLM post-training library ☆116 · Updated last week
- PyTorch/XLA SPMD test code on Google TPUs ☆23 · Updated last year
- A port of the Mistral-7B model to JAX ☆32 · Updated last year
- Unofficial JAX implementations of deep learning research papers ☆156 · Updated 3 years ago
- A minimal PyTorch Lightning OpenAI GPT with DeepSpeed training ☆113 · Updated 2 years ago
- Minimal yet performant LLM examples in pure JAX ☆148 · Updated this week
- ☆19 · Updated 2 years ago
- JAX implementation of Black Forest Labs' Flux.1 family of models ☆35 · Updated last week
- Some common Hugging Face transformers in maximal update parametrization (µP) ☆82 · Updated 3 years ago
- Google TPU optimizations for transformer models ☆118 · Updated 7 months ago
- LoRA for arbitrary JAX models and functions ☆141 · Updated last year
- A fast implementation of T5/UL2 in PyTorch using Flash Attention ☆107 · Updated 5 months ago
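One entry above offers LoRA for arbitrary JAX models. As a rough sketch of the underlying idea (not that library's API; the names `lora_init` and `lora_apply` are illustrative), a low-rank adapter adds a trainable delta `A @ B` on top of a frozen weight:

```python
import jax
import jax.numpy as jnp

def lora_init(key, in_dim, out_dim, rank=4):
    """Low-rank adapter parameters: only A and B are trained."""
    return {
        "a": jax.random.normal(key, (in_dim, rank)) * 0.01,
        # B starts at zero, so the adapted model initially matches the base model.
        "b": jnp.zeros((rank, out_dim)),
    }

def lora_apply(frozen_w, adapter, x):
    """Forward pass through the frozen weight plus the low-rank update."""
    return x @ frozen_w + (x @ adapter["a"]) @ adapter["b"]

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (8, 4))          # frozen base weight
adapter = lora_init(key, 8, 4, rank=2)
x = jnp.ones((2, 8))
out = lora_apply(w, adapter, x)
print(jnp.allclose(out, x @ w))  # True: zero-initialized B means no change yet
```

Because the rank is small, training only `a` and `b` updates far fewer parameters than fine-tuning `w` itself, which is the point of the technique.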