openvinotoolkit / openvino_contrib
Repository for OpenVINO's extra modules
☆163 · Updated this week
Alternatives and similar repositories for openvino_contrib
Users interested in openvino_contrib are comparing it to the libraries listed below.
- A framework to generate a Dockerfile and build, test, and deploy a Docker image with the OpenVINO™ toolkit. ☆71 · Updated 2 weeks ago
- ONNX Runtime: cross-platform, high-performance scoring engine for ML models (see the inference sketch after this list) ☆78 · Updated this week
- Run generative AI models with a simple C++/Python API on top of the OpenVINO Runtime (see the LLM pipeline sketch after this list) ☆433 · Updated this week
- OpenVINO Tokenizers extension ☆48 · Updated this week
- OpenVINO™ integration with TensorFlow ☆180 · Updated last year
- A scalable inference server for models optimized with OpenVINO™ ☆823 · Updated last week
- Deep Learning Inference benchmark. Supports OpenVINO™ toolkit, TensorFlow, TensorFlow Lite, ONNX Runtime, OpenCV DNN, MXNet, PyTorch, Apa… ☆35 · Updated this week
- MIVisionX toolkit is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into … ☆208 · Updated this week
- A project demonstrating how to build DeepStream Docker images. ☆94 · Updated 4 months ago
- Common utilities for ONNX converters ☆294 · Updated last month
- OpenVINO™ Explainable AI (XAI) Toolkit: Visual Explanation for OpenVINO Models ☆36 · Updated 4 months ago
- An open-source, lightweight, high-performance inference framework for Hailo devices ☆164 · Updated last week
- High-performance, optimized pre-trained template AI application pipelines for systems using Hailo devices ☆177 · Updated last month
- This repository provides an optical character detection and recognition solution optimized for Nvidia devices. ☆87 · Updated 8 months ago
- Deep Learning Streamer (DL Streamer) Pipeline Framework is an open-source streaming media analytics framework, based on GStreamer* multim… ☆582 · Updated last week
- C++ Helper Class for Deep Learning Inference Frameworks: TensorFlow Lite, TensorRT, OpenCV, OpenVINO, ncnn, MNN, SNPE, Arm NN, NNabla, ON… ☆298 · Updated 3 years ago
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools (see the export-and-generate sketch after this list) ☆532 · Updated last week
- Model compression for ONNX ☆98 · Updated last year
- AI-related samples made available by the DevTech ProViz team ☆33 · Updated last year
- OpenVINO backend for Triton. ☆37 · Updated this week
- An example of using the DeepStream SDK for redaction ☆211 · Updated last year
- Create a concurrent video analysis pipeline featuring multistream face and human pose detection, vehicle attribute detection, and the abi… ☆52 · Updated last year
- ☆64 · Updated last year
- OpenVX sample implementation ☆148 · Updated last year
- onnxruntime-extensions: A specialized pre- and post-processing library for ONNX Runtime ☆441 · Updated last week
- A project demonstrating how to use nvmetamux to run multiple models in parallel. ☆112 · Updated last year
- Sample apps to demonstrate how to deploy models trained with TAO on DeepStream ☆440 · Updated 3 months ago
- With OpenVINO Test Drive, users can run large language models (LLMs) and models trained by Intel Geti on their devices, including AI PCs … ☆37 · Updated last month
- oneAPI Deep Neural Network Library (oneDNN) ☆22 · Updated this week
- ☆107 · Updated 3 months ago
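
For the ONNX Runtime entry above, here is a minimal Python inference sketch. It sticks to the core `InferenceSession` API; the `model.onnx` path and the zero-filled dummy input are placeholders for illustration, not something taken from the listing.

```python
# Minimal ONNX Runtime inference sketch (assumes a local model.onnx file).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Inspect the first model input and build a dummy tensor of a matching shape,
# substituting 1 for any dynamic (non-integer) dimension.
first_input = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in first_input.shape]
dummy = np.zeros(shape, dtype=np.float32)

# Run the model; passing None returns all outputs.
outputs = session.run(None, {first_input.name: dummy})
print([o.shape for o in outputs])
```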
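For the OpenVINO™ GenAI entry (simple C++/Python API on top of the OpenVINO Runtime), a minimal Python sketch might look like the following. The model directory path is an assumption: it is expected to contain a model already converted to OpenVINO IR, for example one exported with Optimum Intel (see the next sketch).

```python
# Minimal sketch of the openvino_genai LLM pipeline, assuming an OpenVINO IR
# model already sits in "path/to/ov_model_dir" (placeholder path).
import openvino_genai

pipe = openvino_genai.LLMPipeline("path/to/ov_model_dir", "CPU")

# Generate a short completion; max_new_tokens bounds the output length.
print(pipe.generate("What is OpenVINO?", max_new_tokens=64))
```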
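And for 🤗 Optimum Intel, a hedged sketch of exporting a Hugging Face checkpoint to OpenVINO and generating text with it. The `gpt2` model id is only an illustrative choice; any causal-LM checkpoint supported by the library should work the same way.

```python
# Sketch: export a Hugging Face causal LM to OpenVINO via Optimum Intel and
# run generation with it. "gpt2" is a placeholder model id.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# export=True converts the original PyTorch checkpoint to OpenVINO IR on load.
model = OVModelForCausalLM.from_pretrained(model_id, export=True)

inputs = tokenizer("OpenVINO accelerates", return_tensors="pt")
generated = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```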