Ashx098 / Mini-LLM

A ground-up LLM engineering project: tokenizer → architecture → training → scaling laws → inference. Starts at an 80M-parameter model and is engineered to scale to 1B+ with minimal changes. Clean, research-ready code for anyone serious about understanding and building LLMs from first principles.
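To illustrate the "scale with minimal changes" idea, here is a minimal sketch of a single GPT-style decoder config whose width and depth knobs move the parameter count from roughly 80M to 1B+. The class name, field names, and all hyperparameter values are hypothetical illustrations, not taken from the Mini-LLM repo.

```python
# Hypothetical sketch: one config, two scales. Only d_model / n_layers /
# n_heads change between the ~80M and ~1B settings. Numbers are illustrative.
from dataclasses import dataclass

@dataclass
class ModelConfig:
    vocab_size: int = 32_000
    d_model: int = 640
    n_layers: int = 12
    n_heads: int = 10  # head dim = d_model / n_heads = 64

    def approx_params(self) -> int:
        # Standard decoder estimate: ~12 * L * d^2 for attention + MLP
        # blocks, plus tied input/output embeddings (vocab * d).
        return 12 * self.n_layers * self.d_model**2 + self.vocab_size * self.d_model

mini = ModelConfig()                                      # ~79M parameters
big = ModelConfig(d_model=2048, n_layers=20, n_heads=16)  # ~1.07B parameters
print(f"{mini.approx_params() / 1e6:.0f}M -> {big.approx_params() / 1e9:.2f}B")
```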
49 · Jan 29, 2026 · Updated 2 weeks ago

Alternatives and similar repositories for Mini-LLM

Users interested in Mini-LLM are comparing it to the libraries listed below.
