yazon / flexllama (View on GitHub)
🚀 FlexLLama - Lightweight self-hosted tool for running multiple llama.cpp server instances with OpenAI v1 API compatibility and multi-GPU support
☆ 55 · Mar 5, 2026 · Updated last month
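Because FlexLLama advertises OpenAI v1 API compatibility, any OpenAI-style HTTP client should be able to talk to it. A minimal sketch of such a client, assuming a FlexLLama instance listening on localhost:8080 (the host, port, and model name here are illustrative assumptions, not taken from the project's documentation):

```python
import json
from urllib import request

# Hypothetical endpoint; point this at wherever your FlexLLama instance listens.
BASE_URL = "http://localhost:8080/v1"


def build_chat_payload(prompt: str, model: str = "my-local-model") -> dict:
    """Build a request body in the standard OpenAI v1 chat-completions shape."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(prompt: str, model: str = "my-local-model") -> str:
    """POST a chat completion request and return the generated text."""
    payload = build_chat_payload(prompt, model)
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI v1 responses carry the text under choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```

The payload shape follows the published OpenAI chat-completions format; which models are available depends on how the individual llama.cpp instances behind FlexLLama are configured.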

Alternatives and similar repositories for flexllama

Users interested in flexllama are comparing it to the libraries listed below.

