dpressel/mint
MinT: Minimal Transformer Library and Tutorials
☆260 · Updated 3 years ago
Alternatives and similar repositories for mint
Users interested in mint are comparing it to the libraries listed below.
- All about the fundamental blocks of TF and JAX! ☆277 · Updated 4 years ago
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆181 · Updated 9 months ago
- A library to inspect and extract intermediate layers of PyTorch models. ☆476 · Updated 3 years ago
- Annotations of the interesting ML papers I read ☆273 · Updated last month
- Module 0 - Fundamentals ☆110 · Updated last year
- A walkthrough of transformer architecture code ☆370 · Updated last year
- Functional local implementations of main model parallelism approaches ☆95 · Updated 2 years ago
- Check if you have training samples in your test set ☆64 · Updated 3 years ago
- An interactive exploration of Transformer programming. ☆271 · Updated 2 years ago
- FasterAI: Prune and Distill your models with FastAI and PyTorch ☆252 · Updated this week
- All about the fundamentals and working of Diffusion Models ☆160 · Updated 3 years ago
- Convert scikit-learn models to PyTorch modules ☆168 · Updated last year
- Highly commented implementations of Transformers in PyTorch ☆138 · Updated 2 years ago
- An alternative to convolution in neural networks ☆259 · Updated last year
- A case study of efficient training of large language models using commodity hardware. ☆68 · Updated 3 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in JAX (Equinox framework) ☆190 · Updated 3 years ago
- Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022) ☆313 · Updated 3 years ago
- Enabling easy statistical significance testing for deep neural networks.