mistralai / vllm-release

A high-throughput and memory-efficient inference and serving engine for LLMs
51 · Updated last year
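
For context, vllm-release is Mistral AI's release of the vLLM inference engine. Below is a minimal offline-inference sketch assuming the upstream vLLM Python API (`LLM`, `SamplingParams`); the model name is illustrative and the exact interface shipped in this release may differ.

```python
# Minimal offline-inference sketch using the upstream vLLM Python API.
# Assumes `pip install vllm` and a GPU with enough memory for the model.
from vllm import LLM, SamplingParams

# Model name is illustrative; any Hugging Face-hosted model supported by vLLM works.
llm = LLM(model="mistralai/Mistral-7B-Instruct-v0.2")

sampling = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Explain paged attention in one sentence."], sampling)

for out in outputs:
    print(out.outputs[0].text)
```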

Alternatives and similar repositories for vllm-release:

Users interested in vllm-release are comparing it to the libraries listed below.