wangcx18 / llm-vscode-inference-server

An endpoint server for efficiently serving quantized open-source LLMs for code.
57 · Updated 2 years ago
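As a rough sketch of how a client might talk to an endpoint server like this one, the snippet below builds a JSON completion request and posts it over HTTP. The URL, route (`/api/generate/`), and payload shape are illustrative assumptions, not taken from the repository; consult the server's README for its actual API.

```python
import json
import urllib.request

def build_request(prompt: str, max_new_tokens: int = 64) -> bytes:
    """Encode a code-completion request body.

    The field names here ("inputs", "parameters") are assumptions for
    illustration; the real server may expect a different schema.
    """
    return json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }).encode("utf-8")

def request_completion(prompt: str,
                       url: str = "http://localhost:8000/api/generate/") -> str:
    """POST the prompt to a (hypothetical) local inference endpoint."""
    req = urllib.request.Request(
        url,
        data=build_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed response shape: {"generated_text": "..."}
        return json.loads(resp.read())["generated_text"]
```

A VS Code extension such as llm-vscode would typically send the text before the cursor as the prompt and insert the returned completion inline.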

Alternatives and similar repositories for llm-vscode-inference-server

Users interested in llm-vscode-inference-server are comparing it to the libraries listed below.
