HenryNdubuaku / nanodl
A JAX-based library for building transformers; includes implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT, and more.
☆290 · Updated 10 months ago
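To illustrate the kind of building block a JAX transformer library provides, here is a minimal scaled dot-product attention sketch in plain JAX. This is generic illustrative code, not nanodl's actual API — nanodl's own modules are higher-level.

```python
import jax
import jax.numpy as jnp

def scaled_dot_product_attention(q, k, v):
    """Single-head attention: softmax(Q K^T / sqrt(d)) V.

    Generic JAX sketch for illustration only; not nanodl's API.
    """
    d = q.shape[-1]
    # (batch, seq, d) @ (batch, d, seq) -> (batch, seq, seq)
    scores = q @ jnp.swapaxes(k, -2, -1) / jnp.sqrt(d)
    weights = jax.nn.softmax(scores, axis=-1)
    # (batch, seq, seq) @ (batch, seq, d) -> (batch, seq, d)
    return weights @ v

# Toy example: batch of 2 sequences, length 4, model dim 8.
key = jax.random.PRNGKey(0)
q = jax.random.normal(key, (2, 4, 8))
out = scaled_dot_product_attention(q, q, q)
print(out.shape)  # (2, 4, 8)
```

Because the function is pure, it composes directly with `jax.jit` and `jax.vmap`, which is the usual pattern these JAX libraries build on.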
Alternatives and similar repositories for nanodl
Users interested in nanodl are comparing it to the libraries listed below.
- ☆273 · Updated 11 months ago
- Neural Networks for JAX ☆84 · Updated 9 months ago
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and JAX ☆607 · Updated this week
- Named Tensors for Legible Deep Learning in JAX ☆185 · Updated this week
- ☆114 · Updated this week
- 🧱 Modula software package ☆202 · Updated 3 months ago
- JAX Synergistic Memory Inspector ☆175 · Updated 11 months ago
- LoRA for arbitrary JAX models and functions ☆140 · Updated last year
- A simple library for scaling up JAX programs ☆139 · Updated 8 months ago
- A functional training loops library for JAX ☆88 · Updated last year
- Implementation of Flash Attention in JAX ☆213 · Updated last year
- Unofficial JAX implementations of deep learning research papers ☆156 · Updated 3 years ago
- Automatic gradient descent ☆208 · Updated 2 years ago
- For optimization algorithm research and development. ☆521 · Updated this week
- jax-triton contains integrations between JAX and OpenAI Triton ☆405 · Updated 2 weeks ago
- ☆132 · Updated last week
- JMP is a Mixed Precision library for JAX. ☆206 · Updated 5 months ago
- Library for reading and processing ML training data. ☆474 · Updated this week
- Run PyTorch in JAX. 🤝 ☆253 · Updated last week
- JAX-Toolbox ☆320 · Updated this week
- ☆186 · Updated last month
- JAX implementation of the Llama 2 model ☆219 · Updated last year
- Orbax provides common checkpointing and persistence utilities for JAX users ☆404 · Updated this week
- Accelerate and optimize performance with streamlined training and serving options in JAX. ☆288 · Updated this week
- A stand-alone implementation of several NumPy dtype extensions used in machine learning. ☆280 · Updated this week
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… ☆386 · Updated last week
- Home for "How To Scale Your Model", a short blog-style textbook about scaling LLMs on TPUs ☆424 · Updated this week
- CLU lets you write beautiful training loops in JAX. ☆349 · Updated 2 weeks ago
- Modular, scalable library to train ML models ☆133 · Updated this week
- Universal Tensor Operations in Einstein-Inspired Notation for Python. ☆385 · Updated 3 months ago