waybarrios / vllm-mlx
OpenAI- and Anthropic-compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.
780 · Apr 1, 2026 · Updated last week
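Because the server advertises OpenAI compatibility, a standard OpenAI client pointed at the local endpoint should be able to talk to it. The sketch below is illustrative only: the base URL, port, API key placeholder, and model identifier are assumptions, not values documented in this listing.

```python
# Minimal sketch of querying an OpenAI-compatible local server such as vllm-mlx.
# All connection details below are assumptions; adjust them to your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local server address and port
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="mlx-community/Llama-3.1-8B-Instruct-4bit",  # hypothetical model id
    messages=[{"role": "user", "content": "Describe Apple Silicon in one sentence."}],
)
print(response.choices[0].message.content)
```

The same pattern applies to any client that accepts a custom base URL, which is how tools like Claude Code or other OpenAI-API consumers would be pointed at a local endpoint.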
