erans/selfhostllm
A web-based calculator for estimating GPU memory requirements and maximum concurrent requests for self-hosted LLM inference.
37 stars · Feb 25, 2026 · updated 3 weeks ago
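The estimate such a calculator produces typically combines the memory held by the model weights with the per-request KV-cache footprint. The sketch below is a common back-of-the-envelope approximation, not necessarily the exact formula selfhostllm uses; the function name, parameters, and the 1 GiB runtime-overhead default are all illustrative assumptions.

```python
def estimate_max_concurrent(
    vram_gb: float,          # total GPU memory (treated as GiB here, a rough approximation)
    n_params_b: float,       # model size in billions of parameters
    bytes_per_param: float,  # 2 for fp16, 1 for int8, 0.5 for 4-bit quantization
    n_layers: int,           # transformer layer count
    n_kv_heads: int,         # KV attention heads (fewer than query heads under GQA)
    head_dim: int,           # dimension per attention head
    context_len: int,        # max tokens per request
    kv_bytes: float = 2.0,   # fp16 KV cache entries
    overhead_gb: float = 1.0,  # assumed CUDA context / activation overhead
) -> int:
    """Rough upper bound on concurrent requests a single GPU can serve."""
    gib = 1024 ** 3
    weights = n_params_b * 1e9 * bytes_per_param
    # KV cache per token: one key and one value vector per layer per KV head
    kv_per_token = 2 * n_layers * n_kv_heads * head_dim * kv_bytes
    kv_per_request = kv_per_token * context_len
    free = vram_gb * gib - weights - overhead_gb * gib
    return max(0, int(free // kv_per_request))

# Example: a 7B fp16 model (Llama-2-7B-like shape: 32 layers, 32 KV heads,
# head_dim 128) on a 24 GB GPU with a 4096-token context.
print(estimate_max_concurrent(24, 7, 2, 32, 32, 128, 4096))  # → 4
```

In this example the weights alone occupy ~14 GB and each full-context request pins ~2 GiB of KV cache, which is why batch capacity on a single consumer GPU is so sensitive to context length and quantization.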
