wangcx18 / llm-vscode-inference-server
An endpoint server for efficiently serving quantized open-source LLMs for code.
58 stars · Oct 15, 2023 · Updated 2 years ago
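Servers like this typically expose an HTTP endpoint that accepts a code prompt and returns a model completion. The sketch below builds such a request in Python; the `/api/generate/` route, the `inputs`/`parameters` payload shape, and the default port are assumptions modeled on the llm-vscode extension's request format, so check the repository's README for the actual API.

```python
import json
from urllib import request

def build_completion_request(prompt, url="http://localhost:8000/api/generate/"):
    """Build an HTTP POST request for a code-completion prompt.

    NOTE: the route and payload shape here are assumptions, not taken
    from this repository's documented API.
    """
    payload = {
        "inputs": prompt,
        "parameters": {"max_new_tokens": 64, "temperature": 0.2},
    }
    return request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("def fibonacci(n):")
# Sending the request requires a running inference server:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```

Only the request construction is shown; actually sending it needs the server running locally.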

Alternatives and similar repositories for llm-vscode-inference-server

Users interested in llm-vscode-inference-server are comparing it to the libraries listed below.
