EricLBuehler / candle-vllm

Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
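Because the server exposes an OpenAI-compatible API, it can be queried with any standard HTTP client. Below is a minimal sketch assuming the server is running locally on port 2000 and serving a model named "llama" (the host, port, and model name are assumptions; adjust them to match your own launch configuration).

```python
import requests

# Assumed local endpoint of the candle-vllm OpenAI-compatible server.
BASE_URL = "http://localhost:2000/v1"

# Standard OpenAI chat-completions request body.
payload = {
    "model": "llama",  # placeholder model name; use the one you launched
    "messages": [
        {"role": "user", "content": "Explain what an OpenAI-compatible API is in one sentence."}
    ],
    "max_tokens": 128,
}

response = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=60)
response.raise_for_status()

# Print the assistant's reply from the first choice.
print(response.json()["choices"][0]["message"]["content"])
```

The same endpoint can also be used with existing OpenAI client libraries by pointing their base URL at the local server, which is the main convenience of an OpenAI-compatible interface.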

Related projects

Alternatives and complementary repositories for candle-vllm