dpressel / mint
MinT: Minimal Transformer Library and Tutorials
☆260 · Updated 3 years ago
Alternatives and similar repositories for mint
Users interested in mint are comparing it to the libraries listed below.
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆181 · Updated 8 months ago
- Module 0 - Fundamentals ☆111 · Updated last year
- A library to inspect and extract intermediate layers of PyTorch models. ☆475 · Updated 3 years ago
- All about the fundamental blocks of TF and JAX! ☆276 · Updated 4 years ago
- Annotations of the interesting ML papers I read ☆272 · Updated 3 weeks ago
- A walkthrough of transformer architecture code ☆370 · Updated last year
- Check if you have training samples in your test set ☆64 · Updated 3 years ago
- FasterAI: Prune and Distill your models with FastAI and PyTorch ☆252 · Updated this week
- All about the fundamentals and working of Diffusion Models ☆159 · Updated 3 years ago
- Recipes are a standard, well supported set of blueprints for machine learning engineers to rapidly train models using the latest research… ☆338 · Updated this week
- Rax is a Learning-to-Rank library written in JAX. ☆336 · Updated 4 months ago
- Lite Inference Toolkit (LIT) for PyTorch ☆160 · Updated 4 years ago
- Convert scikit-learn models to PyTorch modules ☆168 · Updated last year
- Host repository for the "Reproducible Deep Learning" PhD course ☆407 · Updated 3 years ago
- An alternative to convolution in neural networks ☆259 · Updated last year
- A Python package for benchmarking interpretability techniques on Transformers. ☆215 · Updated last year
- The "tl;dr" on a few notable transformer papers (pre-2022). ☆189 · Updated 3 years ago
- Functional deep learning ☆108 · Updated 3 years ago
- Outlining techniques for improving the training performance of your PyTorch model without compromising its accuracy ☆129 · Updated 2 years ago
- Unofficial JAX implementations of deep learning research papers ☆160 · Updated 3 years ago
- Enabling easy statistical significance testing for deep neural networks. ☆338 · Updated last year
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in JAX (Equinox framework) ☆190 · Updated 3 years ago
- Deep Learning project template best practices with PyTorch Lightning, Hydra, Tensorboard. ☆160 · Updated 4 years ago
- A lightweight library designed to accelerate the process of training PyTorch models by providing a minimal, but extensible training loop… ☆193 · Updated 3 weeks ago
- Interview Questions and Answers for the Machine Learning Engineer role ☆116 · Updated 8 months ago
- 100 exercises to learn JAX ☆596 · Updated 3 years ago
- Amos optimizer with JEstimator lib. ☆82 · Updated last year
- An interactive exploration of Transformer programming. ☆271 · Updated 2 years ago
- Named tensors with first-class dimensions for PyTorch ☆332 · Updated 2 years ago
- git extension for {collaborative, communal, continual} model development ☆217 · Updated last year