openvinotoolkit / operator
OpenVINO operator for OpenShift and Kubernetes
☆14 · Updated 2 months ago
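As with most Kubernetes operators, interaction typically means applying a custom resource that the operator reconciles into a running deployment. Below is a minimal sketch using the official `kubernetes` Python client; the custom resource group, version, kind, plural, and spec fields are illustrative assumptions, not taken from this repository's CRD definitions.

```python
# Minimal sketch: submitting a hypothetical custom resource for an operator
# to reconcile. The group/version/plural and the spec fields below are
# illustrative assumptions, not this operator's actual schema.
from kubernetes import client, config


def create_model_server(namespace: str = "openvino") -> dict:
    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    api = client.CustomObjectsApi()

    manifest = {
        "apiVersion": "intel.com/v1alpha1",       # assumed group/version
        "kind": "ModelServer",                    # assumed kind
        "metadata": {"name": "demo-model-server"},
        "spec": {                                 # assumed spec fields
            "image_name": "openvino/model_server:latest",
        },
    }

    # Create the custom resource; the operator's controller would then
    # reconcile it into Deployments/Services in the cluster.
    return api.create_namespaced_custom_object(
        group="intel.com",      # assumed
        version="v1alpha1",     # assumed
        namespace=namespace,
        plural="modelservers",  # assumed
        body=manifest,
    )


if __name__ == "__main__":
    print(create_model_server())
```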
Related projects
Alternatives and complementary repositories for the operator
- A scalable inference server for models optimized with OpenVINO™ ☆674 · Updated this week
- Inference Model Manager for Kubernetes ☆47 · Updated 5 years ago
- Run Generative AI models with a simple C++/Python API on top of the OpenVINO™ Runtime (a minimal usage sketch follows this list) ☆145 · Updated this week
- Run cloud native workloads on NVIDIA GPUs ☆133 · Updated last month
- A multi-user, distributed computing environment for running DL model training experiments on Intel® Xeon® Scalable processor-based systems ☆393 · Updated 6 months ago
- A framework to generate a Dockerfile and build, test, and deploy a Docker image with the OpenVINO™ toolkit ☆59 · Updated last month
- oneAPI Collective Communications Library (oneCCL) ☆201 · Updated this week
- Home of Intel® Deep Learning Streamer Pipeline Server (formerly Video Analytics Serving) ☆125 · Updated last year
- Controller for ModelMesh ☆204 · Updated 3 months ago
- Kubernetes Operator, Ansible playbooks, and production scripts for large-scale AIStore deployments on Kubernetes ☆74 · Updated 3 weeks ago
- Intel® AI Reference Models: contains Intel optimizations for running deep learning workloads on Intel® Xeon® Scalable processors and Intel®… ☆684 · Updated this week
- Repository for OpenVINO's extra modules ☆105 · Updated last week
- Software Development Kit (SDK) for the Intel® Geti™ platform for Computer Vision AI model training ☆73 · Updated last week
- OpenVINO backend for Triton ☆29 · Updated this week
- Tools to deploy GPU clusters in the Cloud ☆30 · Updated last year
- MIG Partition Editor for NVIDIA GPUs ☆173 · Updated this week
- Unified runtime-adapter image for the sidecar containers that run in ModelMesh pods ☆21 · Updated last month
- OpenVINO Tokenizers extension ☆24 · Updated this week
- Triton Model Analyzer is a CLI tool for understanding the compute and memory requirements of Triton Inference Server models ☆426 · Updated this week
- Triton Model Navigator is an inference toolkit designed for optimizing and deploying Deep Learning models, with a focus on NVIDIA GPUs ☆183 · Updated 2 months ago
- This repository is home to the Intel® Deep Learning Streamer (Intel® DL Streamer) Pipeline Framework, a streaming media analytics framework ☆529 · Updated last week
- The Triton backend for the ONNX Runtime ☆130 · Updated this week
- OpenVINO™ integration with TensorFlow ☆178 · Updated 4 months ago
- Kubernetes Operator for MPI-based applications (distributed training, HPC, etc.) ☆440 · Updated 3 weeks ago
- NVIDIA Data Center GPU Manager (DCGM) is a project for gathering telemetry and measuring the health of NVIDIA GPUs ☆410 · Updated 2 months ago
- Computation using data flow graphs for scalable machine learning ☆66 · Updated this week
- GPU plugin to the node feature discovery for Kubernetes ☆291 · Updated 5 months ago
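To illustrate the "simple C++/Python API" mentioned for the Generative AI entry above, here is a minimal sketch assuming the openvino-genai Python package and a model directory already exported to OpenVINO IR format; the model path, device, and generation parameters below are placeholders, not values from any of the listed repositories.

```python
# Minimal sketch of running a generative model with the OpenVINO GenAI Python API.
# Assumes `pip install openvino-genai` and a model already exported to OpenVINO IR
# (the directory name below is a placeholder).
import openvino_genai


def main() -> None:
    model_dir = "TinyLlama-1.1B-Chat-ov"  # placeholder: any OpenVINO-exported LLM
    pipe = openvino_genai.LLMPipeline(model_dir, "CPU")  # device could also be "GPU"
    # Generate a short completion; max_new_tokens caps the response length.
    print(pipe.generate("What does a Kubernetes operator do?", max_new_tokens=100))


if __name__ == "__main__":
    main()
```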