An Affordable LLM Pre-training Benchmark via Accurate Loss Prediction across Scales
Alternatives and similar repositories for nanoLM
Users interested in nanoLM are comparing it to the repositories listed below.
- Masked Structural Growth for 2x Faster Language Model Pre-training
- JAX implementation of the GPTQ quantization algorithm
- Code, data, and models associated with the paper LLM-RUBRIC: A Multidimensional, Calibrated Approach to Automate…
- A family of efficient edge language models in the 100M–1B parameter range.