openvinotoolkit / openvino
OpenVINO™ is an open source toolkit for optimizing and deploying AI inference.
☆9,670 · Updated this week
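For orientation, here is a minimal sketch of what inference through the OpenVINO Runtime Python API typically looks like, assuming a recent `openvino` package; the model path, device name, and input shape are placeholder assumptions, not part of the toolkit itself.

```python
# Minimal OpenVINO inference sketch (assumes a recent `openvino` Python package).
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")            # placeholder IR model (.xml + .bin)
compiled = core.compile_model(model, "CPU")     # device is an assumption; could be "GPU", "AUTO", ...

x = np.random.rand(1, 3, 224, 224).astype(np.float32)   # placeholder input shape
result = compiled([x])[compiled.output(0)]      # one synchronous inference, first output
print(result.shape)
```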
Alternatives and similar repositories for openvino
Users who are interested in openvino are comparing it to the libraries listed below.
- 📚 Jupyter notebook tutorials for OpenVINO™ ☆3,029 · Updated last week
- Pre-trained Deep Learning models and demos (high quality and extremely fast) ☆4,351 · Updated last week
- oneAPI Deep Neural Network Library (oneDNN) ☆3,960 · Updated this week
- NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source compone… ☆12,672 · Updated this week
- Neural Network Compression Framework for enhanced OpenVINO™ inference ☆1,123 · Updated this week
- Open Machine Learning Compiler Framework ☆13,096 · Updated this week
- The Triton Inference Server provides an optimized cloud and edge inferencing solution. ☆10,334 · Updated this week
- Simplify your ONNX model (a simplification sketch using its Python API follows this list) ☆4,288 · Updated last week
- PyTorch/TorchScript/FX compiler for NVIDIA GPUs using TensorRT ☆2,939 · Updated this week
- ⚠️ DirectML is in maintenance mode ⚠️ DirectML is a high-performance, hardware-accelerated DirectX 12 library for machine learning. Direct… ☆2,545 · Updated this week
- TNN: developed by Tencent Youtu Lab and Guangying Lab, a uniform deep learning inference framework for mobile, desktop and server. TNN is … ☆4,619 · Updated 9 months ago
- ONNX-TensorRT: TensorRT backend for ONNX ☆3,184 · Updated last week
- Convert TensorFlow, Keras, TensorFlow.js and TFLite models to ONNX ☆2,514 · Updated 4 months ago
- ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator (a minimal inference sketch follows this list) ☆19,207 · Updated this week
- A GPU-accelerated library containing highly optimized building blocks and an execution engine for data processing to accelerate deep lear… ☆5,619 · Updated this week
- Olive: Simplify ML Model Finetuning, Conversion, Quantization, and Optimization for CPUs, GPUs and NPUs. ☆2,249 · Updated this week
- CV-CUDA™ is an open-source, GPU accelerated library for cloud-scale image processing and computer vision. ☆2,647 · Updated 2 weeks ago
- AIMET is a library that provides advanced quantization and compression techniques for trained neural network models. ☆2,552 · Updated this week
- Annotate better with CVAT, the industry-leading data engine for machine learning. Used and trusted by teams at any scale, for data of any… ☆15,266 · Updated this week
- A Python package that extends the official PyTorch to deliver additional performance on Intel platforms ☆2,010 · Updated this week
- Open standard for machine learning interoperability ☆20,295 · Updated this week
- A scalable inference server for models optimized with OpenVINO™ ☆823 · Updated this week
- A collection of pre-trained, state-of-the-art models in the ONNX format ☆9,394 · Updated 4 months ago
- Development repository for the Triton language and compiler ☆18,387 · Updated this week
- Run Generative AI models with a simple C++/Python API using the OpenVINO Runtime ☆428 · Updated this week
- High-efficiency floating-point neural network inference operators for mobile, server, and Web ☆2,245 · Updated this week
- MindSpore is an open source deep learning training/inference framework that can be used in mobile, edge and cloud scenarios. ☆4,671 · Updated last year
- Examples for using ONNX Runtime for machine learning inferencing. ☆1,601 · Updated this week
- ncnn is a high-performance neural network inference framework optimized for the mobile platform ☆22,744 · Updated this week
- Tutorials for creating and using ONNX models ☆3,657 · Updated last year
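As referenced from the ONNX Runtime entry above, this is a minimal sketch of what inference with the `onnxruntime` Python package typically looks like; the model path, input shape, and execution provider are placeholder assumptions.

```python
# Minimal ONNX Runtime inference sketch (assumes the `onnxruntime` package is installed).
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx",                      # placeholder model path
                            providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name
x = np.random.rand(1, 3, 224, 224).astype(np.float32)          # placeholder input shape
outputs = sess.run(None, {input_name: x})                      # None -> return every model output
print([o.shape for o in outputs])
```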
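And for the onnx-simplifier entry, a sketch of its Python API under the same placeholder assumptions; the project also provides an `onnxsim` command-line entry point that wraps the same call.

```python
# Minimal onnx-simplifier sketch (assumes the `onnx` and `onnxsim` packages are installed).
import onnx
from onnxsim import simplify

model = onnx.load("model.onnx")                     # placeholder input model
model_simplified, check = simplify(model)           # check is True only if the simplified graph validates
assert check, "simplified ONNX model could not be validated"
onnx.save(model_simplified, "model_simplified.onnx")
```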