sanctuary-systems-com / llama_multiserver
A proxy that hosts multiple single-model runners such as llama.cpp and vLLM.
12 · May 30, 2025 · Updated 9 months ago
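The description above suggests a proxy that routes each incoming request to a dedicated single-model runner (e.g. a llama.cpp or vLLM server). A minimal sketch of that routing idea is below; it is not the project's actual code, and the model names and backend URLs are hypothetical examples.

```python
# Hypothetical sketch: route OpenAI-style requests to per-model backends.
# Model names and ports are illustrative, not from llama_multiserver.
BACKENDS = {
    "llama-3-8b": "http://127.0.0.1:8001",  # e.g. a llama.cpp server
    "qwen2-7b": "http://127.0.0.1:8002",    # e.g. a vLLM server
}

def route(request: dict) -> str:
    """Pick the upstream runner based on the request's 'model' field."""
    model = request.get("model")
    if model not in BACKENDS:
        raise KeyError(f"unknown model: {model!r}")
    return BACKENDS[model]
```

A real multi-model proxy would additionally start and stop the runner processes and forward the HTTP body to the chosen backend; this sketch only shows the model-to-backend dispatch step.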

Alternatives and similar repositories for llama_multiserver

Users interested in llama_multiserver are comparing it to the libraries listed below.
