minitorch / Module-0
Module 0 - Fundamentals
☆106 · Updated last year
Alternatives and similar repositories for Module-0
Users interested in Module-0 are comparing it to the libraries listed below.
- MinT: Minimal Transformer Library and Tutorials ☆257 · Updated 3 years ago
- Docs ☆143 · Updated 9 months ago
- A pure-functional implementation of a machine learning transformer model in Python/JAX ☆179 · Updated 4 months ago
- An interactive exploration of Transformer programming. ☆269 · Updated last year
- Functional local implementations of main model parallelism approaches ☆96 · Updated 2 years ago
- Resources from the EleutherAI Math Reading Group ☆54 · Updated 6 months ago
- A case study of efficient training of large language models using commodity hardware. ☆68 · Updated 3 years ago
- Annotations of the interesting ML papers I read ☆251 · Updated last month
- HomebrewNLP in JAX flavour for maintainable TPU training ☆50 · Updated last year
- ☆104 · Updated 4 years ago
- A library to create and manage configuration files, especially for machine learning projects. ☆79 · Updated 3 years ago
- Functional deep learning ☆108 · Updated 2 years ago
- ☆108 · Updated 2 years ago
- Silly twitter torch implementations. ☆46 · Updated 2 years ago
- Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale, TACL (2022) ☆129 · Updated 2 months ago
- The "tl;dr" on a few notable transformer papers (pre-2022). ☆190 · Updated 2 years ago
- Framework-agnostic library for checking array/tensor shapes at runtime. ☆46 · Updated 4 years ago
- HetSeq: Distributed GPU Training on Heterogeneous Infrastructure ☆106 · Updated 2 years ago
- Check if you have training samples in your test set ☆64 · Updated 3 years ago
- ☆453 · Updated 10 months ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆188 · Updated 3 years ago
- A minimal implementation of autograd (in pure Python) 🍰 ☆95 · Updated 4 years ago
- ☆67 · Updated 3 years ago
- ☆166 · Updated 2 years ago
- An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers" ☆320 · Updated last year
- A walkthrough of transformer architecture code ☆354 · Updated last year
- ☆153 · Updated 5 years ago
- Visualising the Transformer encoder ☆111 · Updated 4 years ago
- Serialize JAX, Flax, Haiku, or Objax model params with 🤗 `safetensors` ☆45 · Updated last year
- Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022) ☆311 · Updated 2 years ago