Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on a Raspberry Pi Zero 2 (in as little as 298 MB of RAM), as well as Mistral 7B on desktops and servers. ARM, x86, WASM, and RISC-V are supported. Accelerated by XNNPACK. Python, C#, and JS (WASM) bindings are available.
☆ 2,054 · Updated Jan 20, 2026 (2 months ago)
Alternatives and similar repositories for OnnxStream
Users interested in OnnxStream are comparing it to the libraries listed below.
- Diffusion model (SD, Flux, Wan, Qwen Image, Z-Image, ...) inference in pure C/C++ — ☆ 5,682 · Updated Apr 1, 2026 (last week)
- Llama 2 Everywhere (L2E) — ☆ 1,528 · Updated Aug 27, 2025 (7 months ago)
- ☆ 1,274 · Updated Oct 24, 2023 (2 years ago)
- Tiny Dream — an embedded, header-only Stable Diffusion C++ implementation — ☆ 265 · Updated Oct 31, 2023 (2 years ago)
- Distribute and run LLMs with a single file. — ☆ 24,000 · Updated Apr 2, 2026 (last week)
- This repository contains a pure C++ ONNX implementation of multiple offline AI models, such as StableDiffusion (1.5 and XL), ControlNet, … — ☆ 631 · Updated May 29, 2025 (10 months ago)
- Tensor library for machine learning — ☆ 14,394 · Updated this week
- Fast stable diffusion on CPU and AI PC — ☆ 2,031 · Updated Jan 10, 2026 (3 months ago)