erans/selfhostllm
A web-based calculator for estimating GPU memory requirements and maximum concurrent requests for self-hosted LLM inference.
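The usual arithmetic behind this kind of estimate combines the memory for model weights with the per-request KV-cache footprint. Below is a simplified sketch of that standard calculation; it is not necessarily the exact formula selfhostllm uses, and all model dimensions and function names are illustrative assumptions.

```python
GIB = 1024 ** 3  # bytes per GiB


def weights_bytes(n_params: float, bytes_per_param: int = 2) -> int:
    """Memory for model weights; fp16/bf16 weights use 2 bytes per parameter."""
    return int(n_params * bytes_per_param)


def kv_cache_bytes_per_request(n_layers: int, n_kv_heads: int, head_dim: int,
                               context_len: int, bytes_per_val: int = 2) -> int:
    """KV-cache size for one request at full context length.

    The factor of 2 accounts for storing both the K and V tensors at
    every transformer layer.
    """
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_val


def max_concurrent_requests(vram_bytes: int, n_params: float, n_layers: int,
                            n_kv_heads: int, head_dim: int,
                            context_len: int) -> int:
    """Requests that fit after weights are loaded (ignores runtime overhead)."""
    free = vram_bytes - weights_bytes(n_params)
    per_req = kv_cache_bytes_per_request(n_layers, n_kv_heads, head_dim, context_len)
    return max(free // per_req, 0)


# Example: a 7B fp16 model (32 layers, 32 KV heads, head dim 128)
# served at 4096-token context on a 24 GiB GPU.
print(max_concurrent_requests(24 * GIB, 7e9, 32, 32, 128, 4096))  # → 5
```

Real deployments reserve additional VRAM for activations, CUDA context, and framework overhead, and grouped-query-attention models have fewer KV heads than attention heads, so a production estimate would subtract a fixed overhead and read `n_kv_heads` from the model config rather than assuming it equals the head count.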
44 · Feb 25, 2026 · Updated last month

Alternatives and similar repositories for selfhostllm

Users interested in selfhostllm are comparing it to the libraries listed below.

