OpenCSGs / llm-inference

llm-inference is a platform for publishing and managing LLM inference, providing a range of out-of-the-box features for model deployment, such as a UI, a RESTful API, auto-scaling, compute resource management, monitoring, and more.
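As a rough illustration of what driving such a RESTful deployment API might look like, the sketch below builds a generation request body in Python. The endpoint path, model name, and payload fields are assumptions for illustration only, not the project's documented API.

```python
import json

# Hypothetical endpoint for a deployed model; the path and model name
# are placeholders, not llm-inference's actual route.
ENDPOINT = "http://localhost:8000/api/v1/models/my-model/predict"

def build_request(prompt: str, max_tokens: int = 128) -> str:
    """Serialize a generation request body (assumed schema)."""
    return json.dumps({"prompt": prompt, "max_tokens": max_tokens})

body = build_request("Hello, world")
# The body could then be POSTed to ENDPOINT with any HTTP client,
# e.g. `requests.post(ENDPOINT, data=body)`.
print(body)
```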
74 stars · Updated 8 months ago

Alternatives and similar repositories for llm-inference:

Users interested in llm-inference are comparing it to the libraries listed below.