EricLBuehler / candle-vllm

Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
306 · Updated this week
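
Because candle-vllm exposes an OpenAI-compatible API server, a standard OpenAI client can talk to it once the server is running. Below is a minimal sketch using the official `openai` Python client; the base URL, port, and model name are assumptions for illustration only, not values documented on this page.

```python
# Hedged sketch: calling an OpenAI-compatible endpoint (such as one served
# by candle-vllm) with the openai Python client. The base_url and model name
# are placeholders, not values taken from this listing.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:2000/v1",  # assumed local endpoint
    api_key="EMPTY",                      # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama-7b",  # hypothetical model identifier
    messages=[{"role": "user", "content": "Summarize what candle-vllm does."}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```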

Alternatives and similar repositories for candle-vllm:

Users interested in candle-vllm are comparing it to the libraries listed below.