RayFernando1337 / LLM-CalcView on GitHub
Instantly calculate the maximum size of quantized language models that can fit in your available RAM, helping you optimize your models for inference.
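The underlying estimate is roughly parameter count × bits per weight ÷ 8 bytes, minus some headroom for the OS and inference context. A minimal sketch of that arithmetic, assuming a 20% overhead fraction (the function name and overhead value are illustrative assumptions, not LLM-Calc's exact formula):

```python
def max_params_in_ram(ram_gb: float, bits_per_weight: float,
                      overhead_fraction: float = 0.2) -> float:
    """Estimate the largest parameter count (in billions of parameters)
    that fits in the given RAM at a given quantization level.

    overhead_fraction reserves memory for the OS, KV cache, and
    activations; 0.2 is an illustrative default, not a measured value.
    """
    usable_bytes = ram_gb * 1e9 * (1 - overhead_fraction)
    bytes_per_param = bits_per_weight / 8
    return usable_bytes / bytes_per_param / 1e9

# Example: 16 GB of RAM with 4-bit quantization leaves room for
# roughly a 25.6B-parameter model under these assumptions.
print(max_params_in_ram(16, 4))
```

At 4 bits each parameter costs half a byte, which is why halving the quantization width roughly doubles the model size that fits.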
259 stars · Feb 22, 2026 · Updated 2 months ago

Alternatives and similar repositories for LLM-Calc

Users interested in LLM-Calc are comparing it to the libraries listed below.

