triton-inference-server / openvino_backend
OpenVINO backend for Triton.
☆36 · Updated last week
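Since the listing gives little context, here is a minimal sketch of how a model might be registered with this backend in a Triton model repository. The model name, tensor names, shapes, and directory layout are illustrative assumptions, not taken from the repository.

```
# Hypothetical model repository layout (all names are placeholders):
# model_repository/
#   resnet50_openvino/
#     config.pbtxt
#     1/
#       model.xml   # OpenVINO IR network definition
#       model.bin   # OpenVINO IR weights

name: "resnet50_openvino"
backend: "openvino"
max_batch_size: 8
input [
  {
    name: "input"          # placeholder input tensor name
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output"         # placeholder output tensor name
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```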
Alternatives and similar repositories for openvino_backend
Users interested in openvino_backend are comparing it to the repositories listed below.
- The Triton backend for the ONNX Runtime. ☆171 · Updated last week
- ☆130 · Updated last week
- The Triton backend for the PyTorch TorchScript models. ☆171 · Updated last week
- Common source, scripts and utilities for creating Triton backends. ☆366 · Updated last week
- The Triton backend for TensorRT. ☆84 · Updated this week
- Triton CLI is an open source command line interface that enables users to create, deploy, and profile models served by the Triton Inference Server. ☆73 · Updated last week
- Common source, scripts and utilities shared across all Triton repositories. ☆79 · Updated last week
- The Triton backend for TensorFlow. ☆57 · Updated 2 months ago
- Triton Model Analyzer is a CLI tool to help with better understanding of the compute and memory requirements of the Triton Inference Server models.