dpressel / mint
MinT: Minimal Transformer Library and Tutorials
☆257 · Updated 3 years ago
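For context, MinT builds transformer components from first principles in PyTorch. As a rough sketch of the kind of minimal building block such tutorials cover, here is a generic single-head self-attention module (the name `MinimalSelfAttention` is made up for illustration; this is not MinT's actual API):

```python
# A minimal single-head self-attention block in plain PyTorch.
# Generic illustrative sketch -- not taken from the MinT codebase.
import math
import torch
import torch.nn as nn

class MinimalSelfAttention(nn.Module):  # hypothetical name, for illustration
    def __init__(self, d_model: int):
        super().__init__()
        self.d_model = d_model
        # One linear projection each for queries, keys, and values.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_model)
        weights = scores.softmax(dim=-1)
        return weights @ v

# Usage: a batch of 2 sequences of length 5 with d_model=8.
attn = MinimalSelfAttention(d_model=8)
out = attn(torch.randn(2, 5, 8))  # -> shape (2, 5, 8)
```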
Alternatives and similar repositories for mint
Users interested in mint are comparing it to the libraries listed below.
- Module 0 - Fundamentals ☆106 · Updated last year
- A library to inspect and extract intermediate layers of PyTorch models. ☆473 · Updated 3 years ago
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆178 · Updated 3 months ago
- Annotations of the interesting ML papers I read ☆250 · Updated last month
- A walkthrough of transformer architecture code ☆353 · Updated last year
- FasterAI: Prune and Distill your models with FastAI and PyTorch ☆249 · Updated 2 months ago
- The "tl;dr" on a few notable transformer papers (pre-2022). ☆190 · Updated 2 years ago
- Check if you have training samples in your test set ☆64 · Updated 3 years ago
- Enabling easy statistical significance testing for deep neural networks. ☆336 · Updated last year
- Recipes are a standard, well-supported set of blueprints for machine learning engineers to rapidly train models using the latest research… ☆326 · Updated last week
- All about the fundamental blocks of TF and JAX! ☆276 · Updated 3 years ago
- An interactive exploration of Transformer programming. ☆269 · Updated last year
- An alternative to convolution in neural networks ☆257 · Updated last year
- Highly commented implementations of Transformers in PyTorch ☆136 · Updated 2 years ago
- Functional deep learning ☆108 · Updated 2 years ago
- Host repository for the "Reproducible Deep Learning" PhD course ☆406 · Updated 3 years ago
- A lightweight library designed to accelerate the process of training PyTorch models by providing a minimal, but extensible training loop… ☆190 · Updated 2 months ago
- Interview Questions and Answers for Machine Learning Engineer role ☆119 · Updated 3 months ago
- A case study of efficient training of large language models using commodity hardware. ☆68 · Updated 3 years ago
- Lite Inference Toolkit (LIT) for PyTorch ☆161 · Updated 3 years ago
- Original transformer paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information process… ☆240 · Updated last year
- Cyclemoid implementation for PyTorch ☆90 · Updated 3 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in JAX (Equinox framework) ☆188 · Updated 3 years ago
- Docs ☆143 · Updated 9 months ago
- Convert scikit-learn models to PyTorch modules ☆164 · Updated last year
- All about the fundamentals and working of Diffusion Models ☆159 · Updated 2 years ago
- Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale, TACL (2022) ☆128 · Updated 2 months ago
- A Python package for benchmarking interpretability techniques on Transformers. ☆214 · Updated 11 months ago
- Rax is a Learning-to-Rank library written in JAX. ☆331 · Updated last week
- HetSeq: Distributed GPU Training on Heterogeneous Infrastructure ☆106 · Updated 2 years ago