WasmEdge / mediapipe-rs
A Rust library for Google's MediaPipe AI tasks. Write AI inference applications for image recognition, text classification, audio/video processing, and more in Rust, and run them in the secure WasmEdge sandbox. Zero Python dependency!
☆172 · Updated 5 months ago
Alternatives and similar repositories for mediapipe-rs:
Users interested in mediapipe-rs are comparing it to the libraries listed below:
- Low-rank adaptation (LoRA) for Candle ☆144 · Updated 6 months ago
- Llama2 LLM ported to Rust burn ☆277 · Updated 10 months ago
- Rust bindings to https://github.com/k2-fsa/sherpa-onnx ☆134 · Updated 2 weeks ago
- A Rust implementation of OpenAI's Whisper model using the burn framework ☆293 · Updated 10 months ago
- Stable Diffusion XL ported to Rust's burn framework ☆259 · Updated 10 months ago
- Inference Llama 2 in one file of pure Rust 🦀 ☆232 · Updated last year
- Stable Diffusion v1.4 ported to Rust's burn framework ☆326 · Updated 5 months ago
- Rust audio/video engine ☆86 · Updated last year
- High-level, optionally asynchronous Rust bindings to llama.cpp ☆213 · Updated 9 months ago
- Moly: an AI LLM GUI app in pure Rust ☆204 · Updated this week
- Models and examples built with Burn ☆217 · Updated last week
- An LLM interface (chat bot) implemented in pure Rust using HuggingFace/Candle over Axum Websockets, an SQLite Database, and a Leptos (Was… ☆128 · Updated 5 months ago
- ONNX neural network inference engine ☆186 · Updated this week
- A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust ☆79 · Updated last year
- Lightweight HTTP servers based on hyper / warp frameworks in the WasmEdge Runtime ☆85 · Updated 7 months ago
- Efficient platform for inference and serving local LLMs, including an OpenAI-compatible API server.