wangcx18 / llm-vscode-inference-server

An endpoint server for efficiently serving quantized open-source LLMs for code.
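As a rough sketch of how a client might talk to such an inference endpoint: the host, port, path, and payload schema below are assumptions for illustration only, not the project's documented API.

```python
import json
import urllib.request

def build_completion_request(prompt, max_new_tokens=64,
                             url="http://localhost:8000/api/generate/"):
    # Hypothetical request shape: a JSON body with the prompt and a few
    # generation parameters, POSTed to the server's completion endpoint.
    payload = json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Build a request for a code-completion prompt; it could then be sent
# with urllib.request.urlopen(req) once the server is running.
req = build_completion_request("def fibonacci(n):")
```

The actual route and field names depend on the server's configuration, so check the repository's README before wiring this into an editor extension.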
55 stars · Updated last year

Alternatives and similar repositories for llm-vscode-inference-server

Users interested in llm-vscode-inference-server are comparing it to the libraries listed below.
