openvinotoolkit / workbench
☆28 · Updated 2 years ago
Alternatives and similar repositories for workbench
Users interested in workbench are comparing it to the libraries listed below.
- A scalable inference server for models optimized with OpenVINO™ ☆752 · Updated this week
- Neural Network Compression Framework for enhanced OpenVINO™ inference ☆1,074 · Updated this week
- Repository for OpenVINO's extra modules ☆137 · Updated 2 weeks ago
- OpenVINO™ integration with TensorFlow ☆179 · Updated last year
- DL Streamer is now part of Open Edge Platform; for the latest updates and releases, please visit the new repo: https://github.com/open-edge-platfo… ☆564 · Updated last month
- A framework to generate a Dockerfile, then build, test, and deploy a Docker image with the OpenVINO™ toolkit ☆67 · Updated last month
- OpenVINO™ Explainable AI (XAI) Toolkit: visual explanations for OpenVINO models ☆32 · Updated 5 months ago
- Run generative AI models with a simple C++/Python API using OpenVINO Runtime ☆323 · Updated last week
- OpenVINO Tokenizers extension ☆40 · Updated this week
- Intel® AI Reference Models: contains Intel optimizations for running deep learning workloads on Intel® Xeon® Scalable processors and Inte… ☆717 · Updated 2 weeks ago
- SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX R… ☆2,482 · Updated this week
- Dataset Management Framework: a Python library and CLI tool to build, analyze, and manage computer vision datasets ☆640 · Updated this week
- Software Development Kit (SDK) for the Intel® Geti™ platform for computer vision AI model training ☆119 · Updated this week
- With OpenVINO Test Drive, users can run large language models (LLMs) and models trained by Intel Geti on their devices, including AI PCs … ☆30 · Updated this week
- A curated list of OpenVINO-based AI projects ☆149 · Updated 2 months ago
- 🤗 Optimum Intel: accelerate inference with Intel optimization tools ☆485 · Updated last week
- Sample apps demonstrating how to deploy models trained with TAO on DeepStream ☆424 · Updated 6 months ago
- OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference ☆8,740 · Updated this week
- A Python package that extends official PyTorch to easily obtain better performance on Intel platforms ☆1,940 · Updated this week
- onnxruntime-extensions: a specialized pre- and post-processing library for ONNX Runtime ☆409 · Updated 2 weeks ago
- ONNX Runtime: a cross-platform, high-performance scoring engine for ML models ☆70 · Updated last week
- ONNX Optimizer ☆745 · Updated 3 weeks ago
- Reference implementations of MLPerf™ inference benchmarks ☆1,443 · Updated last week
- ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python ☆376 · Updated last week
- Common utilities for ONNX converters ☆277 · Updated this week
- This script converts an ONNX/OpenVINO IR model to TensorFlow's saved_model, tflite, h5, tfjs, tftrt (TensorRT), CoreML, EdgeTPU, ONNX and… ☆343 · Updated 2 years ago
- Pre-built components and code samples to help you build and deploy production-grade AI applications with the OpenVINO™ toolkit from Intel ☆164 · Updated last week
- 📚 Jupyter notebook tutorials for OpenVINO™ ☆2,882 · Updated last week
- DeepStream SDK Python bindings and sample applications ☆1,677 · Updated 10 months ago
- Triton Python, C++, and Java client libraries, and gRPC-generated client examples for Go, Java, and Scala ☆641 · Updated last week