EricLBuehler / candle-vllm

Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
591 · Updated Jan 28, 2026
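Since the project advertises an OpenAI-compatible API server, a request against it should follow the standard OpenAI chat-completions shape. The sketch below is a minimal illustration only: the server address, port, and model name are assumptions, not values taken from candle-vllm's documentation, so consult the project's README for the actual endpoint and supported model IDs.

```rust
// Minimal sketch of calling an OpenAI-compatible chat endpoint.
// Requires `reqwest` (with the "blocking" and "json" features) and
// `serde_json` in Cargo.toml. The URL, port, and model name are
// hypothetical placeholders for illustration.

use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // Standard OpenAI-style chat-completions request body.
    let body = json!({
        "model": "example-model",                     // hypothetical model ID
        "messages": [
            { "role": "user", "content": "Hello!" }
        ],
        "max_tokens": 64
    });

    // Assumed local server address; the real port depends on how the
    // server was launched.
    let response_text = client
        .post("http://localhost:2000/v1/chat/completions")
        .json(&body)
        .send()?
        .text()?;

    println!("{response_text}");
    Ok(())
}
```

Because the server mimics the OpenAI wire format, existing OpenAI client libraries can generally be pointed at it by overriding their base URL instead of hand-rolling requests as above.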

Alternatives and similar repositories for candle-vllm

Users interested in candle-vllm are comparing it to the libraries listed below.
