openvinotoolkit / openvino_contrib
Repository for OpenVINO's extra modules
☆155 · Updated 2 weeks ago
Alternatives and similar repositories for openvino_contrib
Users who are interested in openvino_contrib are comparing it to the libraries listed below.
- A framework to generate a Dockerfile and to build, test, and deploy a Docker image with the OpenVINO™ toolkit. ☆68 · Updated 3 weeks ago
- ONNX Runtime: cross-platform, high-performance scoring engine for ML models ☆75 · Updated last week
- OpenVINO™ integration with TensorFlow ☆178 · Updated last year
- Run Generative AI models with a simple C++/Python API using OpenVINO Runtime (a minimal usage sketch follows this list) ☆403 · Updated this week
- OpenVINO Tokenizers extension ☆44 · Updated last week
- A scalable inference server for models optimized with OpenVINO™ ☆804 · Updated last week
- onnxruntime-extensions: a specialized pre- and post-processing library for ONNX Runtime ☆431 · Updated last week
- Deep Learning Inference benchmark. Supports OpenVINO™ toolkit, TensorFlow, TensorFlow Lite, ONNX Runtime, OpenCV DNN, MXNet, PyTorch, Apa… ☆33 · Updated 3 weeks ago
- Common utilities for ONNX converters ☆289 · Updated last week
- OpenVINO ARM64 Notes ☆49 · Updated 5 years ago
- This repository provides an optical character detection and recognition solution optimized for NVIDIA devices. ☆85 · Updated 7 months ago
- AI-related samples made available by the DevTech ProViz team ☆32 · Updated last year
- Deep Learning Streamer (DL Streamer) Pipeline Framework is an open-source streaming media analytics framework, based on GStreamer* multim… ☆567 · Updated this week
- A toolkit to help optimize ONNX models ☆280 · Updated last week
- MIVisionX toolkit is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into … ☆206 · Updated last week
- OpenVINO™ Explainable AI (XAI) Toolkit: Visual Explanation for OpenVINO Models ☆33 · Updated 3 months ago
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools ☆518 · Updated last week
- A project demonstrating how to make DeepStream Docker images. ☆92 · Updated 2 months ago
- Model compression for ONNX ☆99 · Updated last year
- An open-source, lightweight, high-performance inference framework for Hailo devices ☆150 · Updated last month
- C++ Helper Class for Deep Learning Inference Frameworks: TensorFlow Lite, TensorRT, OpenCV, OpenVINO, ncnn, MNN, SNPE, Arm NN, NNabla, ON… ☆299 · Updated 3 years ago
- An example of using the DeepStream SDK for redaction ☆211 · Updated last year
- Pre-built components and code samples to help you build and deploy production-grade AI applications with the OpenVINO™ Toolkit from Intel ☆195 · Updated this week
- A curated list of OpenVINO-based AI projects ☆176 · Updated 5 months ago
- ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python. ☆414 · Updated this week
- OpenVINO backend for Triton. ☆36 · Updated 2 weeks ago
- High-performance, optimized pre-trained template AI application pipelines for systems using Hailo devices ☆174 · Updated this week
- oneAPI Deep Neural Network Library (oneDNN) ☆22 · Updated this week
- ☆106 · Updated 2 months ago
- C++ API for ML inferencing and transfer-learning on Coral devices ☆96 · Updated last year
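
Several entries above (the OpenVINO GenAI project, the OpenVINO™ model server, Optimum Intel) build on the same OpenVINO Runtime. As a point of reference, here is a minimal sketch of the "simple Python API" flow mentioned in the GenAI entry. It assumes the openvino-genai package is installed; the model directory path and the prompt are hypothetical, and the model is assumed to have already been exported to OpenVINO IR (for example with optimum-cli).

```python
# Minimal sketch, not taken from any repository listed above.
# Assumptions: `pip install openvino-genai` has been run, and ./model_ov is a
# hypothetical local directory holding a text-generation model already
# exported to OpenVINO IR (e.g. via `optimum-cli export openvino`).
import openvino_genai as ov_genai

# Load the exported model and its tokenizer onto the CPU device.
pipe = ov_genai.LLMPipeline("./model_ov", "CPU")

# Generate a short completion for a prompt.
print(pipe.generate("OpenVINO extra modules are", max_new_tokens=64))
```

The C++ API follows the same pattern via ov::genai::LLMPipeline.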