yazon / flexllama

🚀 FlexLLama - Lightweight self-hosted tool for running multiple llama.cpp server instances with OpenAI v1 API compatibility and multi-GPU support
★ 19 · Updated last week
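
Since FlexLLama advertises OpenAI v1 API compatibility, a client can talk to it the same way it would talk to the OpenAI API. The sketch below shows one way that might look; the base URL, port, and model name are assumptions for illustration and are not taken from the repository, so check its README for the actual defaults.

```python
# Minimal sketch of calling an OpenAI v1-compatible chat-completions endpoint,
# such as the one FlexLLama exposes. Host, port, and model alias are assumed.
import requests

BASE_URL = "http://localhost:8080/v1"  # assumed host/port, not confirmed by the repo

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "llama-3-8b-instruct",  # hypothetical model alias
        "messages": [
            {"role": "user", "content": "Summarize what llama.cpp is in one sentence."}
        ],
        "max_tokens": 128,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```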

Alternatives and similar repositories for flexllama

Users interested in flexllama are comparing it to the libraries listed below.
