RayFernando1337 / LLM-Calc

Instantly calculate the maximum size of quantized language model that can fit in your available RAM, helping you choose a model and quantization level for local inference.
226 stars · Updated last month
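The calculation the tool performs can be sketched as follows. This is a minimal illustration, not the repository's actual code: the formula (usable bytes divided by bytes per weight) and the 20% overhead reserve for the OS, KV cache, and activations are assumptions chosen for the example.

```python
def max_model_params(ram_gb: float, bits_per_weight: float, overhead_frac: float = 0.2) -> float:
    """Estimate the largest parameter count that fits in RAM.

    ram_gb: total available memory in gigabytes.
    bits_per_weight: quantization level (e.g. 4 for Q4, 8 for Q8, 16 for fp16).
    overhead_frac: fraction reserved for OS, KV cache, and activations (assumed value).
    Returns an approximate maximum number of model parameters.
    """
    usable_bytes = ram_gb * 1e9 * (1 - overhead_frac)
    bytes_per_weight = bits_per_weight / 8
    return usable_bytes / bytes_per_weight


# Example: 16 GB of RAM with 4-bit quantization.
params = max_model_params(16, 4)
print(f"~{params / 1e9:.1f}B parameters")  # roughly a 25B-parameter model at Q4
```

With 16 GB and 4-bit weights, 80% of memory divided by half a byte per weight gives about 25.6 billion parameters, which is why 4-bit quantization lets mid-size models run on consumer hardware.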

Alternatives and similar repositories for LLM-Calc

Users interested in LLM-Calc are comparing it to the libraries listed below.
