dpressel / mint
MinT: Minimal Transformer Library and Tutorials
☆259 · Updated 3 years ago
Alternatives and similar repositories for mint
Users who are interested in mint are comparing it to the libraries listed below.
- Module 0 - Fundamentals ☆109 · Updated last year
- FasterAI: Prune and Distill your models with FastAI and PyTorch ☆250 · Updated last month
- A library to inspect and extract intermediate layers of PyTorch models. ☆475 · Updated 3 years ago
- Check if you have training samples in your test set ☆64 · Updated 3 years ago
- A walkthrough of transformer architecture code ☆371 · Updated last year
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆181 · Updated 7 months ago
- An alternative to convolution in neural networks ☆258 · Updated last year
- Annotations of the interesting ML papers I read ☆266 · Updated last month
- The "tl;dr" on a few notable transformer papers (pre-2022). ☆190 · Updated 2 years ago
- Recipes are a standard, well supported set of blueprints for machine learning engineers to rapidly train models using the latest research… ☆334 · Updated this week
- A case study of efficient training of large language models using commodity hardware. ☆68 · Updated 3 years ago
- Docs ☆143 · Updated last year
- All about the fundamental blocks of TF and JAX! ☆276 · Updated 4 years ago
- Convert scikit-learn models to PyTorch modules ☆168 · Updated last year
- Host repository for the "Reproducible Deep Learning" PhD course ☆407 · Updated 3 years ago
- Enabling easy statistical significance testing for deep neural networks. ☆338 · Updated last year
- Cyclemoid implementation for PyTorch ☆90 · Updated 3 years ago
- Outlining techniques for improving the training performance of your PyTorch model without compromising its accuracy ☆129 · Updated 2 years ago
- 100 exercises to learn JAX ☆594 · Updated 3 years ago
- Functional deep learning ☆108 · Updated 3 years ago
- Memory mapped numpy arrays of varying shapes ☆305 · Updated last year
- Highly commented implementations of Transformers in PyTorch ☆139 · Updated 2 years ago
- An interactive exploration of Transformer programming. ☆270 · Updated 2 years ago
- Lite Inference Toolkit (LIT) for PyTorch ☆161 · Updated 4 years ago
- Amos optimizer with JEstimator lib. ☆82 · Updated last year
- A python package for benchmarking interpretability techniques on Transformers. ☆214 · Updated last year
- git extension for {collaborative, communal, continual} model development ☆216 · Updated last year
- Unofficial JAX implementations of deep learning research papers ☆159 · Updated 3 years ago
- Original transformer paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information process… ☆241 · Updated last year
- All about the fundamentals and working of Diffusion Models ☆159 · Updated 2 years ago