triton-inference-server / vllm_backend
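As context for the comparisons below, here is a minimal sketch of querying a model served through vllm_backend. It assumes Triton's HTTP generate endpoint on the default port, a hypothetical model name `vllm_model`, and the `text_input`/`text_output` field names; all of these depend on your actual deployment and model configuration.

```python
# Hypothetical sketch: sending a prompt to a model served by Triton's vLLM backend
# via the server's HTTP generate endpoint. Host, port, model name, and field names
# are assumptions and may differ in your setup.
import requests

TRITON_URL = "http://localhost:8000"   # assumed default Triton HTTP port
MODEL_NAME = "vllm_model"              # hypothetical model name

payload = {
    "text_input": "What is Triton Inference Server?",
    "parameters": {"stream": False, "temperature": 0, "max_tokens": 64},
}

resp = requests.post(f"{TRITON_URL}/v2/models/{MODEL_NAME}/generate", json=payload)
resp.raise_for_status()
print(resp.json().get("text_output"))
```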
Alternatives and similar repositories for vllm_backend:
Users interested in vllm_backend are comparing it to the libraries listed below.
- OpenAI-compatible API for the TensorRT-LLM Triton backend (example usage sketched below)
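Because that frontend exposes an OpenAI-compatible API, a standard OpenAI-style client can talk to it. The sketch below assumes a hypothetical base URL and model name; substitute whatever your deployment actually exposes.

```python
# Hypothetical sketch: calling an OpenAI-compatible frontend that sits in front of
# the TensorRT-LLM Triton backend. The base_url, api_key handling, and model name
# are assumptions, not part of the listed project.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:9000/v1", api_key="not-needed")

completion = client.chat.completions.create(
    model="ensemble",  # hypothetical model name exposed by the frontend
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=32,
)
print(completion.choices[0].message.content)
```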