erans/selfhostllm
A web-based calculator for estimating GPU memory requirements and maximum concurrent requests for self-hosted LLM inference.
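The estimate such a calculator produces typically splits VRAM into a fixed cost for the model weights plus a per-request cost for the KV cache. The sketch below shows that standard approach; it is an illustration under assumed formulas (parameter count × bytes per parameter for weights, 2 × layers × hidden dim × context length × element size for the KV cache), not necessarily the exact formula selfhostllm uses.

```python
# Hedged sketch of the usual GPU-memory estimate for LLM inference.
# Assumption: VRAM = weights + (KV cache per concurrent request) * N,
# so N_max = floor((VRAM - weights) / kv_per_request).

def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Model weights: parameter count (in billions) times precision bytes."""
    return params_billion * bytes_per_param

def kv_cache_gb_per_request(layers: int, hidden: int, seq_len: int,
                            bytes_per_elem: int = 2) -> float:
    """KV cache for one request: 2 (K and V) x layers x hidden x context."""
    return 2 * layers * hidden * seq_len * bytes_per_elem / 1e9

def max_concurrent(vram_gb: float, params_billion: float,
                   bytes_per_param: float, layers: int, hidden: int,
                   seq_len: int) -> int:
    """How many requests fit after the weights are resident in VRAM."""
    free = vram_gb - weight_memory_gb(params_billion, bytes_per_param)
    per_req = kv_cache_gb_per_request(layers, hidden, seq_len)
    return max(0, int(free // per_req))

# Example: a 7B model in FP16 on a 24 GB GPU with a 4096-token context.
print(max_concurrent(24, 7, 2, layers=32, hidden=4096, seq_len=4096))  # → 4
```

With these assumed figures, the weights take 14 GB, each request's KV cache takes about 2.15 GB, and roughly 4 concurrent requests fit in the remaining 10 GB. Real deployments also reserve memory for activations and framework overhead, which a calculator like this may account for separately.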
★ 36 · Feb 25, 2026 · Updated this week

Alternatives and similar repositories for selfhostllm

Users who are interested in selfhostllm are comparing it to the libraries listed below.
