openvinotoolkit / openvino_testdrive
With OpenVINO Test Drive, users can run large language models (LLMs) and models trained with Intel Geti on their own devices, including AI PCs and edge devices.
☆30 · Updated this week
Alternatives and similar repositories for openvino_testdrive
Users interested in openvino_testdrive are comparing it to the libraries listed below.
- Run Generative AI models with a simple C++/Python API using OpenVINO Runtime ☆323 · Updated last week
- OpenVINO Tokenizers extension ☆40 · Updated this week
- A curated list of OpenVINO-based AI projects ☆149 · Updated last month
- A framework to generate a Dockerfile, then build, test, and deploy a Docker image with the OpenVINO™ toolkit ☆67 · Updated last month
- Repository for OpenVINO's extra modules ☆137 · Updated 2 weeks ago
- oneAPI Deep Neural Network Library (oneDNN) ☆20 · Updated this week
- A toolkit to help optimize ONNX models ☆198 · Updated this week
- A scalable inference server for models optimized with OpenVINO™ ☆752 · Updated this week
- Pre-built components and code samples to help you build and deploy production-grade AI applications with the OpenVINO™ toolkit from Intel ☆164 · Updated last week
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools ☆485 · Updated last week
- Software Development Kit (SDK) for the Intel® Geti™ platform for computer vision AI model training ☆119 · Updated this week
- OpenVINO™ Explainable AI (XAI) Toolkit: visual explanations for OpenVINO models ☆32 · Updated 5 months ago
- Build computer vision models in a fraction of the time and with less data ☆363 · Updated this week
- Implementation of YOLOv10 in C++17 using OpenCV and ONNX Runtime ☆89 · Updated 11 months ago
- ONNX Runtime: cross-platform, high-performance scoring engine for ML models ☆70 · Updated this week
- No-code CLI designed for accelerating ONNX workflows ☆208 · Updated 2 months ago
- Use safetensors with ONNX 🤗 ☆69 · Updated last month
- The Qualcomm® AI Hub apps are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.) a… ☆270 · Updated 3 weeks ago
- The Qualcomm® AI Hub Models are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.)… ☆773 · Updated this week
- Intel® NPU Acceleration Library ☆688 · Updated 4 months ago
- Python scripts for the Segment Anything 2 (SAM2) model in ONNX ☆262 · Updated last year
- This repository provides an optical character detection and recognition solution optimized for NVIDIA devices ☆77 · Updated 3 months ago
- DL Streamer is now part of Open Edge Platform; for the latest updates and releases, please visit the new repo: https://github.com/open-edge-platfo… ☆564 · Updated last month
- Common utilities for ONNX converters ☆277 · Updated this week
- ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python ☆376 · Updated last week
- A toolkit to help optimize large ONNX models ☆158 · Updated last year
- LiteRT continues the legacy of TensorFlow Lite as the trusted, high-performance runtime for on-device AI. Now with LiteRT Next, we're exp… ☆737 · Updated this week
- A project demonstrating how to make DeepStream Docker images ☆82 · Updated 8 months ago
- ☆99 · Updated 11 months ago
- OpenVINO™ integration with TensorFlow ☆179 · Updated last year