openvinotoolkit / awesome-openvino
A curated list of OpenVINO-based AI projects
☆132 · Updated 4 months ago
Alternatives and similar repositories for awesome-openvino
Users interested in awesome-openvino are comparing it to the libraries listed below.
- Run Generative AI models with a simple C++/Python API using the OpenVINO Runtime (a minimal usage sketch follows this list) ☆274 · Updated this week
- With OpenVINO Test Drive, users can run large language models (LLMs) and models trained by Intel Geti on their devices, including AI PCs … ☆24 · Updated last week
- OpenVINO Tokenizers extension ☆33 · Updated this week
- Pre-built components and code samples to help you build and deploy production-grade AI applications with the OpenVINO™ Toolkit from Intel ☆144 · Updated last week
- Repository for OpenVINO's extra modules ☆121 · Updated last week
- An Awesome list of oneAPI projects ☆143 · Updated 5 months ago
- Local LLM Server with NPU Acceleration ☆180 · Updated last week
- 🤗 Optimum Intel: Accelerate inference with Intel optimization tools ☆464 · Updated this week
- Build computer vision models in a fraction of the time and with less data. ☆215 · Updated this week
- OpenVINO™ integration with TensorFlow ☆179 · Updated 10 months ago
- Software Development Kit (SDK) for the Intel® Geti™ platform for Computer Vision AI model training. ☆106 · Updated this week
- This repository contains Dockerfiles, scripts, yaml files, Helm charts, etc. used to scale out AI containers with versions of TensorFlow … ☆45 · Updated this week
- Developer kits reference setup scripts for various kinds of Intel platforms and GPUs ☆24 · Updated this week
- An innovative library for efficient LLM inference via low-bit quantization ☆350 · Updated 8 months ago
- A scalable inference server for models optimized with OpenVINO™ ☆723 · Updated this week
- The framework to generate a Dockerfile, build, test, and deploy a docker image with the OpenVINO™ toolkit. ☆65 · Updated 3 weeks ago
- High-performance, optimized pre-trained template AI application pipelines for systems using Hailo devices ☆136 · Updated last month
- OpenVINO Intel NPU Compiler ☆50 · Updated this week
- Optimized local inference for LLMs with HuggingFace-like APIs for quantization, vision/language models, multimodal agents, speech, vector… ☆263 · Updated 6 months ago
- Tools for easier OpenVINO development/debugging ☆9 · Updated last month
- The NVIDIA RTX™ AI Toolkit is a suite of tools and SDKs for Windows developers to customize, optimize, and deploy AI models across RTX PC… ☆151 · Updated 5 months ago
- The Qualcomm Cloud AI SDK (Platform and Apps) enables high-performance deep learning inference on Qualcomm Cloud AI platforms delivering high … ☆60 · Updated 6 months ago
- Official repository of the Intel Certified Developer Program ☆85 · Updated last week
- ☆108 · Updated last month
- Generative AI extensions for onnxruntime ☆710 · Updated this week
- Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU) ☆186 · Updated this week
- ☆106 · Updated 3 weeks ago
- An open-source, lightweight, high-performance inference framework for Hailo devices ☆108 · Updated last month
- This reference can be used with any existing OpenAI-integrated apps to run with TRT-LLM inference locally on GeForce GPU on Windows inste… ☆120 · Updated last year
- Intel® NPU Acceleration Library ☆671 · Updated 3 weeks ago
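For orientation, here is a minimal usage sketch for the first entry above (the OpenVINO GenAI runtime). It assumes the openvino-genai Python package is installed and that a model has already been exported to OpenVINO format (for example with the Optimum Intel tooling also listed above); the model directory path, device string, and prompt are placeholders, not part of the listed project.

```python
# Minimal sketch: generating text with OpenVINO GenAI's Python API.
# Assumes `pip install openvino-genai` and that ./model_dir contains an
# OpenVINO-format LLM (placeholder path; export it yourself beforehand).
import openvino_genai as ov_genai

# Build a text-generation pipeline on CPU; "GPU" or "NPU" can be used
# instead on supported hardware.
pipe = ov_genai.LLMPipeline("./model_dir", "CPU")

# Run a short completion for a placeholder prompt.
print(pipe.generate("What is OpenVINO?", max_new_tokens=100))
```

A typical export step, using the Optimum Intel project listed above, looks roughly like `optimum-cli export openvino --model <model-id> ./model_dir`, where `<model-id>` is a Hugging Face model identifier of your choice.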