yazon / flexllama
🚀 FlexLLama - Lightweight self-hosted tool for running multiple llama.cpp server instances with OpenAI v1 API compatibility and multi-GPU support
☆ 50 · Feb 17, 2026 · Updated last week
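
Because FlexLLama advertises OpenAI v1 API compatibility, a client would talk to it the same way it talks to any OpenAI-compatible server. The sketch below only builds such a request without sending it; the base URL, port, and model name are assumptions for illustration, not values taken from the project's documentation.

```python
import json
from urllib import request

# Assumed local endpoint; FlexLLama's actual default host/port may differ.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model: str, prompt: str) -> request.Request:
    """Build an OpenAI-v1-style chat completion request (not sent here)."""
    payload = {
        # The model field is how an OpenAI-compatible multi-instance server
        # would route the request to a specific llama.cpp backend.
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama-3-8b", "Hello!")
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would require a running server, so it is omitted here.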

Alternatives and similar repositories for flexllama

Users interested in flexllama are comparing it to the libraries listed below.
