wangcx18 / llm-vscode-inference-server

An endpoint server for efficiently serving quantized open-source LLMs for code.
54 stars · Updated last year
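The description says the repository is an endpoint server for serving quantized code LLMs. As a rough illustration only, here is a minimal sketch of what a client-side completion request to such a server might look like; the `inputs`/`parameters` field names and default values are assumptions modeled on common text-generation API conventions, not taken from this repository.

```python
import json

def build_completion_request(prompt: str, max_new_tokens: int = 64,
                             temperature: float = 0.2) -> str:
    """Serialize a JSON body for a code-completion request.

    The payload shape here (``inputs`` plus a ``parameters`` object) is a
    hypothetical example, not this repository's actual API.
    """
    payload = {
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": temperature,
        },
    }
    return json.dumps(payload)

# Build a request body for completing a partial function definition.
body = build_completion_request("def fibonacci(n):")
print(body)
```

A client would POST a body like this to the server's completion endpoint and read the generated code from the response; consult the repository's README for the actual route and schema.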

Alternatives and similar repositories for llm-vscode-inference-server:

Users interested in llm-vscode-inference-server are comparing it to the libraries listed below.