dpressel/mint
MinT: Minimal Transformer Library and Tutorials
☆ 253 · Updated 2 years ago
Alternatives and similar repositories for mint:
Users interested in mint are comparing it to the repositories listed below.
- A pure-functional implementation of a machine learning transformer model in Python/JAX · ☆ 177 · Updated last month
- A library to inspect and extract intermediate layers of PyTorch models · ☆ 472 · Updated 2 years ago
- Annotations of interesting ML papers I read · ☆ 236 · Updated 2 weeks ago
- Host repository for the "Reproducible Deep Learning" PhD course · ☆ 406 · Updated 2 years ago
- Named tensors with first-class dimensions for PyTorch · ☆ 321 · Updated last year
- Recipes are a standard, well-supported set of blueprints for machine learning engineers to rapidly train models using the latest research… · ☆ 310 · Updated this week
- An interactive exploration of Transformer programming · ☆ 261 · Updated last year
- Module 0 - Fundamentals · ☆ 101 · Updated 6 months ago
- For optimization algorithm research and development · ☆ 498 · Updated this week
- FasterAI: Prune and distill your models with FastAI and PyTorch · ☆ 247 · Updated last week
- Check if you have training samples in your test set · ☆ 64 · Updated 2 years ago
- Helps you write algorithms in PyTorch that adapt to the available (CUDA) memory · ☆ 435 · Updated 6 months ago
- All about the fundamentals and workings of diffusion models · ☆ 154 · Updated 2 years ago
- Puzzles for exploring transformers · ☆ 333 · Updated last year
- ☆ 342 · Updated 11 months ago
- Highly commented implementations of Transformers in PyTorch · ☆ 132 · Updated last year
- An alternative to convolution in neural networks · ☆ 254 · Updated 11 months ago
- ☆ 420 · Updated 5 months ago
- Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022) · ☆ 310 · Updated 2 years ago
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… · ☆ 370 · Updated this week
- Repository containing code for the "How to Train BERT with an Academic Budget" paper · ☆ 312 · Updated last year
- ☆ 165 · Updated last year
- Deep learning with PyTorch Lightning · Updated 4 months ago
- Implementation of the Adan (ADAptive Nesterov momentum algorithm) optimizer in PyTorch · ☆ 251 · Updated 2 years ago
- Outlining techniques for improving the training performance of your PyTorch model without compromising its accuracy · ☆ 126 · Updated last year
- Library for 8-bit optimizers and quantization routines · ☆ 717 · Updated 2 years ago
- Visualising the Transformer encoder · ☆ 111 · Updated 4 years ago
- Implementation of the specific Transformer architecture from PaLM (Scaling Language Modeling with Pathways) in JAX (Equinox framework) · ☆ 187 · Updated 2 years ago
- Automatic gradient descent · ☆ 207 · Updated last year
- Resources from the EleutherAI Math Reading Group · ☆ 53 · Updated 3 weeks ago