PABannier / nanograd
A lightweight deep learning framework
☆34 · Updated 4 years ago
Alternatives and similar repositories for nanograd
Users interested in nanograd are comparing it to the libraries listed below:
- A JAX-based library for building transformers, including implementations of GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT and more. ☆297 · Updated last year
- A port of the Mistral-7B model to JAX ☆32 · Updated last year
- A really tiny autograd engine ☆96 · Updated 6 months ago
- Solve puzzles. Learn CUDA. ☆64 · Updated last year
- Automatic gradient descent ☆215 · Updated 2 years ago
- Symbolic API for model creation in PyTorch. ☆68 · Updated 8 months ago
- Serialize JAX, Flax, Haiku, or Objax model params with 🤗 `safetensors` ☆47 · Updated last year
- Unofficial JAX implementations of deep learning research papers ☆159 · Updated 3 years ago
- Neural Networks for JAX ☆84 · Updated last year
- Run PyTorch in JAX. 🤝 ☆307 · Updated last month
- ☆138 · Updated last year
- Documented and unit-tested educational deep learning framework with autograd from scratch. ☆122 · Updated last year
- Deep learning library implemented from scratch in NumPy. Mixtral, Mamba, LLaMA, GPT, ResNet, and other experiments. ☆53 · Updated last year
- An interactive exploration of Transformer programming. ☆270 · Updated 2 years ago
- Modular, scalable library to train ML models ☆176 · Updated this week
- ☆285 · Updated last year
- A Python package of computer vision models for the Equinox ecosystem. ☆110 · Updated last year
- ☆210 · Updated last year
- A tool to analyze and debug neural networks in PyTorch. Use a GUI to traverse the computation graph and view the data from many different… ☆293 · Updated 11 months ago
- Resources from the EleutherAI Math Reading Group ☆54 · Updated 9 months ago
- A user-friendly toolchain that enables the seamless execution of ONNX models using JAX as the backend. ☆125 · Updated 2 months ago
- A functional training loops library for JAX ☆88 · Updated last year
- Various handy scripts to quickly set up new Linux and Windows sandboxes, containers and WSL. ☆40 · Updated 2 weeks ago
- JAX implementation of Black Forest Labs' Flux.1 family of models ☆39 · Updated last week
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆181 · Updated 6 months ago
- ☆460 · Updated last year
- PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition… ☆188 · Updated last month
- Implementation of Flash Attention in JAX ☆222 · Updated last year
- Code implementing "Efficient Parallelization of a Ubiquitous Sequential Computation" (Heinsen, 2023) ☆97 · Updated 11 months ago
- JAX/Flax rewrite of Karpathy's nanoGPT ☆62 · Updated 2 years ago