RayFernando1337 / LLM-Calc (on GitHub)
Instantly calculate the maximum size of quantized language models that can fit in your available RAM, helping you optimize your models for inference.
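The core arithmetic behind such a calculator can be sketched as follows. This is a minimal illustration, not LLM-Calc's actual code: the function name, the 80% usable-RAM headroom, and the defaults are all assumptions made here for the example.

```python
def max_model_params(ram_gb: float, bits_per_weight: float = 4.0,
                     usable_fraction: float = 0.8) -> float:
    """Estimate the largest parameter count that fits in RAM.

    usable_fraction reserves headroom for the OS, KV cache, and
    activations (0.8 is an assumed default, not the tool's value).
    """
    usable_bytes = ram_gb * 1e9 * usable_fraction
    bytes_per_param = bits_per_weight / 8  # e.g. 4-bit -> 0.5 bytes
    return usable_bytes / bytes_per_param

# 16 GB of RAM with 4-bit quantization leaves room for roughly
# a 25B-parameter model under these assumptions:
print(max_model_params(16))  # 25600000000.0
```

Lower-bit quantization (e.g. 2-bit) raises the ceiling proportionally, while fp16 weights (16 bits) cut it by a factor of four relative to 4-bit.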
253 stars · Feb 22, 2026 · Updated 2 weeks ago

Alternatives and similar repositories for LLM-Calc

Users interested in LLM-Calc are comparing it to the libraries listed below.
