intel / onnxruntime
ONNX Runtime: cross-platform, high-performance scoring engine for ML models
☆78 · Updated this week
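As a quick orientation, the sketch below shows what "scoring" a model with ONNX Runtime's Python API typically looks like. It is a minimal, illustrative example: the model path, input name, and tensor shape are placeholders, not taken from this repository or any entry in the list below.

```python
# Minimal ONNX Runtime scoring sketch (Python API).
# "model.onnx" and the dummy input shape are placeholders for illustration.
import numpy as np
import onnxruntime as ort

# Create an inference session; the CPU execution provider is always available.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Ask the model which input it expects instead of hard-coding a name.
input_meta = session.get_inputs()[0]
print("input:", input_meta.name, input_meta.shape)

# Score a dummy tensor; replace it with real, preprocessed data.
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_meta.name: dummy})
print("output shape:", outputs[0].shape)
```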
Alternatives and similar repositories for onnxruntime
Users interested in onnxruntime are comparing it to the libraries listed below.
- Repository for OpenVINO's extra modules ☆162 · Updated last week
- The framework to generate a Dockerfile, build, test, and deploy a Docker image with the OpenVINO™ toolkit. ☆70 · Updated last week
- onnxruntime-extensions: a specialized pre- and post-processing library for ONNX Runtime ☆441 · Updated this week
- MIVisionX toolkit is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into … ☆208 · Updated this week
- Common utilities for ONNX converters ☆293 · Updated last month
- AI-related samples made available by the DevTech ProViz team ☆33 · Updated last year
- nvImageCodec: a library of GPU- and CPU-accelerated codecs featuring a unified interface ☆139 · Updated 3 weeks ago
- OpenVINO Tokenizers extension ☆48 · Updated last week
- Windows version of NVIDIA's NCCL ('Nickel') for multi-GPU training - please use https://github.com/NVIDIA/nccl for changes. ☆61 · Updated 2 months ago
- Run Generative AI models with a simple C++/Python API using the OpenVINO Runtime ☆428 · Updated this week
- AMD's graph optimization engine. ☆273 · Updated this week
- C++ Helper Class for Deep Learning Inference Frameworks: TensorFlow Lite, TensorRT, OpenCV, OpenVINO, ncnn, MNN, SNPE, Arm NN, NNabla, ON… ☆298 · Updated 3 years ago
- ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python. ☆420 · Updated this week
- ONNX Runtime Inference C++ Example ☆257 · Updated 10 months ago
- Count the number of parameters / MACs / FLOPs for ONNX models. ☆95 · Updated last year
- A scalable inference server for models optimized with OpenVINO™ ☆823 · Updated this week
- Use the OpenCV API to run ONNX models with ONNX Runtime. ☆23 · Updated last week
- Examples for using ONNX Runtime for model training. ☆361 · Updated last year
- A toolkit to help optimize large ONNX models ☆163 · Updated 3 months ago
- ☆137 · Updated last week
- ONNX Optimizer (see the sketch after this list) ☆795 · Updated last week
- Large Language Model ONNX Inference Framework ☆36 · Updated 2 months ago
- The Triton backend for TensorRT. ☆84 · Updated 2 weeks ago
- cudnn_frontend provides a C++ wrapper for the cuDNN backend API and samples showing how to use it ☆682 · Updated last week
- The Triton backend that allows running GPU-accelerated data pre-processing pipelines implemented in DALI's Python API. ☆140 · Updated last week
- Computation using data flow graphs for scalable machine learning ☆68 · Updated this week
- ☆43 · Updated 3 years ago
- Convert ANY IR to ONNX format ☆24 · Updated 2 weeks ago
- Convert TensorFlow Lite models (*.tflite) to ONNX. ☆171 · Updated 2 years ago
- oneAPI Deep Neural Network Library (oneDNN) ☆22 · Updated 2 weeks ago
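For the ONNX Optimizer entry above, a rough usage sketch with its `onnxoptimizer` Python package is shown below. The file paths are placeholders, and the default pass set is assumed; this is illustrative, not a definitive recipe.

```python
# Rough sketch of running the ONNX Optimizer ("onnxoptimizer" package) on a model.
# "model.onnx" and "model.opt.onnx" are placeholder paths for illustration.
import onnx
import onnxoptimizer

model = onnx.load("model.onnx")

# Show the graph-rewriting passes the package ships with.
print(onnxoptimizer.get_available_passes())

# Apply the default set of passes and save the optimized graph.
optimized = onnxoptimizer.optimize(model)
onnx.save(optimized, "model.opt.onnx")
```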