asprenger / ray_vllm_inference

A simple service that integrates vLLM with Ray Serve for fast and scalable LLM serving.
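As a rough illustration of the kind of integration the repository describes, the sketch below wraps a vLLM engine in a Ray Serve deployment. This is not the repository's actual code: the model name, request fields, and GPU settings are placeholder assumptions, and it uses vLLM's synchronous `LLM` class for brevity where a production service would more likely use the async engine.

```python
# Minimal sketch (assumptions, not ray_vllm_inference's implementation) of
# serving a vLLM model behind Ray Serve.
from ray import serve
from starlette.requests import Request
from vllm import LLM, SamplingParams


@serve.deployment(num_replicas=1, ray_actor_options={"num_gpus": 1})
class VLLMDeployment:
    def __init__(self, model: str = "facebook/opt-125m"):  # placeholder model
        # Load the model once per replica; vLLM handles batching internally.
        self.llm = LLM(model=model)

    async def __call__(self, request: Request) -> dict:
        body = await request.json()
        params = SamplingParams(max_tokens=body.get("max_tokens", 128))
        # generate() takes a list of prompts and returns one result per prompt.
        outputs = self.llm.generate([body["prompt"]], params)
        return {"text": outputs[0].outputs[0].text}


app = VLLMDeployment.bind()
# Start locally with: serve.run(app), then POST {"prompt": "..."} to the route.
```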

Alternatives and similar repositories for ray_vllm_inference

Users interested in ray_vllm_inference are comparing it to the libraries listed below.
