second-state / WasmEdge-WASINN-examples
☆255 Updated last month
Alternatives and similar repositories for WasmEdge-WASINN-examples
Users interested in WasmEdge-WASINN-examples are comparing it to the libraries listed below.
- Neural Network proposal for WASI ☆521 Updated 11 months ago
- OpenAI compatible API for serving LLAMA-2 model ☆218 Updated 2 years ago
- A cross-platform browser ML framework. ☆718 Updated 11 months ago
- Lightweight database clients in the WasmEdge Runtime ☆71 Updated last year
- Vercel and web-llm template to run wasm models directly in the browser. ☆164 Updated last year
- Approx nearest neighbor search in Rust ☆167 Updated 2 years ago
- The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge ☆1,516 Updated last week
- Tensor library for machine learning ☆273 Updated 2 years ago
- The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes ☆236 Updated 2 months ago
- Web-optimized vector database (written in Rust). ☆256 Updated 7 months ago
- ☆139 Updated last year
- The Google mediapipe AI library. Write AI inference applications for image recognition, text classification, audio / video processing and… ☆210 Updated last year
- LLM Orchestrator built in Rust ☆283 Updated last year
- Continuous runtime observability SDKs to monitor WebAssembly code. ☆179 Updated last year
- xet client tech, used in huggingface_hub ☆302 Updated this week
- A Rust library for using stable diffusion functions from WASI applications running on WasmEdge. ☆12 Updated 11 months ago
- Inference Llama 2 in one file of pure Rust 🦀 ☆233 Updated 2 years ago
- A single-binary, GPU-accelerated LLM server (HTTP and WebSocket API) written in Rust ☆79 Updated last year
- The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge ☆22 Updated 7 months ago
- JS tokenizer for LLaMA 1 and 2 ☆360 Updated last year
- Rust framework for LLM orchestration ☆203 Updated last year
- 🦀 Rust + Large Language Models - Make AI Services Freely and Easily. ☆183 Updated last year
- A template project for building a database-driven microservice in Rust and running it in the WasmEdge sandbox. ☆371 Updated 6 months ago
- WebAssembly (Wasm) Build and Bindings for llama.cpp ☆283 Updated last year
- Simple Rust applications that run in WasmEdge ☆33 Updated 2 years ago
- Run any ML model from any programming language. ☆424 Updated last year
- llama.cpp Rust bindings ☆407 Updated last year
- Minimal LLM inference in Rust ☆1,013 Updated last year
- In-memory vector store with efficient read and write performance for semantic caching and retrieval systems. Redis for Semantic Caching. ☆372 Updated 10 months ago
- Efficient platform for inference and serving local LLMs, including an OpenAI-compatible API server. ☆496 Updated this week
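
Many of the projects above sit on the same stack that WasmEdge-WASINN-examples demonstrates: the WASI-NN proposal, GGML/llama.cpp backends, and Rust applications compiled to Wasm. For orientation, here is a minimal sketch of that WASI-NN call flow. It assumes the `wasmedge_wasi_nn` crate (`GraphBuilder`, `GraphEncoding::Ggml`, `ExecutionTarget::AUTO`) and a model preloaded by the host under the alias "default" (e.g. via WasmEdge's `--nn-preload` option); the prompt, buffer size, and error handling are illustrative, not a definitive implementation.

```rust
// Minimal, illustrative WASI-NN text generation from a Wasm guest.
// Assumption: the host has preloaded a GGUF model, e.g.
//   wasmedge --dir .:. --nn-preload default:GGML:AUTO:model.gguf app.wasm
use wasmedge_wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn main() {
    // Load the graph registered by the host under the alias "default" (GGML backend).
    let graph = GraphBuilder::new(GraphEncoding::Ggml, ExecutionTarget::AUTO)
        .build_from_cache("default")
        .expect("failed to load the preloaded model");
    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create an execution context");

    // The prompt is passed as a 1-D U8 tensor at input index 0.
    let prompt = "Once upon a time"; // illustrative prompt
    ctx.set_input(0, TensorType::U8, &[1], prompt.as_bytes())
        .expect("failed to set the input tensor");

    // Run inference and read the generated bytes back from output index 0.
    ctx.compute().expect("inference failed");
    let mut output = vec![0u8; 4096]; // illustrative buffer size
    let mut n = ctx.get_output(0, &mut output).expect("failed to read output");
    n = n.min(output.len()); // clamp in case the reported size exceeds the buffer
    println!("{}", String::from_utf8_lossy(&output[..n]));
}
```

Such a guest is typically built for a `wasm32-wasi`/`wasm32-wasip1` target and run with a WasmEdge build that has the WASI-NN plugin installed; see the examples repository above for the authoritative, backend-specific variants.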