pranftw / neograd
A deep learning framework created from scratch with Python and NumPy
☆236 · Updated 2 years ago
Alternatives and similar repositories for neograd:
Users interested in neograd are comparing it to the libraries listed below.
- A documented and unit-tested educational deep learning framework with autograd, built from scratch. ☆111 · Updated last year
- Minimal example scripts of the Hugging Face Trainer, focused on staying under 150 lines ☆198 · Updated 11 months ago
- Execute a Jupyter notebook, fast, without needing Jupyter ☆131 · Updated 3 weeks ago
- All about the fundamental blocks of TF and JAX! ☆274 · Updated 3 years ago
- Outlining techniques for improving the training performance of your PyTorch model without compromising its accuracy ☆127 · Updated 2 years ago
- For optimization algorithm research and development. ☆508 · Updated this week
- ☆163 · Updated 11 months ago
- MinT: Minimal Transformer Library and Tutorials ☆254 · Updated 2 years ago
- A really tiny autograd engine ☆92 · Updated last year
- A tool to analyze and debug neural networks in PyTorch. Use a GUI to traverse the computation graph and view the data from many different… ☆286 · Updated 4 months ago
- A pure NumPy implementation of Mamba. ☆222 · Updated 9 months ago
- ☆428 · Updated 6 months ago
- A curated list of awesome fastai projects/blog posts/tutorials/etc. ☆169 · Updated 3 years ago
- A Jax-based library for designing and training small transformers. ☆286 · Updated 7 months ago
- A simple byte pair encoding (BPE) tokenizer, written purely in C ☆129 · Updated 5 months ago
- An interactive HTML pretty-printer for machine learning research in IPython notebooks. ☆407 · Updated this week
- Highly commented implementations of Transformers in PyTorch ☆136 · Updated last year
- ☆143 · Updated 2 years ago
- Clean up the public namespace of your package! ☆55 · Updated this week
- An implementation of the transformer architecture as Nvidia CUDA kernels ☆179 · Updated last year
- Implementation of Diffusion Transformer (DiT) in JAX ☆272 · Updated 10 months ago
- A subset of PyTorch's neural network modules, written in Python using OpenAI's Triton. ☆534 · Updated this week
- Display progress as a pretty table in the command line. ☆133 · Updated last month
- Quadra: Effortless and reproducible deep learning workflows with configuration files. ☆49 · Updated this week
- Recreating PyTorch from scratch (C/C++, CUDA, NCCL and Python, with multi-GPU support and automatic differentiation!) ☆150 · Updated 10 months ago
- Implementation of Flash Attention in Jax ☆206 · Updated last year
- Zero to Hero GPU and CUDA for Maths & ML tutorials with examples. ☆182 · Updated last week
- Solve puzzles. Learn CUDA. ☆63 · Updated last year
- Minimalist deep learning library written from scratch in Python, using NumPy/CuPy. ☆121 · Updated 2 years ago
- Install PyTorch distributions with computation backend auto-detection ☆251 · Updated 4 months ago