dpressel / mint
MinT: Minimal Transformer Library and Tutorials
☆253 · Updated 2 years ago
Alternatives and similar repositories for mint:
Users interested in mint are comparing it to the libraries listed below.
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆177 · Updated 2 months ago
- A library to inspect and extract intermediate layers of PyTorch models. ☆472 · Updated 2 years ago
- A Jax-based library for designing and training small transformers. ☆286 · Updated 7 months ago
- All about the fundamental blocks of TF and JAX! ☆274 · Updated 3 years ago
- Module 0 - Fundamentals ☆101 · Updated 7 months ago
- Named tensors with first-class dimensions for PyTorch ☆320 · Updated last year
- Deep learning with PyTorch Lightning · Updated 5 months ago
- An interactive exploration of Transformer programming. ☆262 · Updated last year
- Check if you have training samples in your test set ☆64 · Updated 2 years ago
- 100 exercises to learn JAX ☆576 · Updated 2 years ago
- All about the fundamentals and working of Diffusion Models ☆155 · Updated 2 years ago
- A Pytree Module system for Deep Learning in JAX ☆214 · Updated 2 years ago
- Recipes are a standard, well-supported set of blueprints for machine learning engineers to rapidly train models using the latest research… ☆312 · Updated last week
- FasterAI: Prune and Distill your models with FastAI and PyTorch ☆247 · Updated 2 weeks ago
- ☆346 · Updated last year
- Implementation of Flash Attention in Jax ☆206 · Updated last year
- Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022) ☆310 · Updated 2 years ago
- Unofficial JAX implementations of deep learning research papers ☆155 · Updated 2 years ago
- Highly commented implementations of Transformers in PyTorch ☆135 · Updated last year
- A case study of efficient training of large language models using commodity hardware. ☆69 · Updated 2 years ago
- Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale, TACL (2022) ☆125 · Updated 5 months ago
- Outlining techniques for improving the training performance of your PyTorch model without compromising its accuracy ☆127 · Updated 2 years ago
- Lightning Bits: Engineering for Researchers repo ☆132 · Updated 2 years ago
- Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions ☆258 · Updated last year
- Functional local implementations of main model parallelism approaches ☆95 · Updated 2 years ago
- Amos optimizer with JEstimator lib. ☆82 · Updated 11 months ago
- A walkthrough of transformer architecture code ☆338 · Updated last year
- Sequence modeling with Mega. ☆295 · Updated 2 years ago
- Repository containing code for "How to Train BERT with an Academic Budget" paper ☆312 · Updated last year
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆187 · Updated 2 years ago