trymirai / uzu
A high-performance inference engine for AI models
☆1,383 · Updated this week
Alternatives and similar repositories for uzu
Users interested in uzu are comparing it to the libraries listed below.
- Open-source LLM load balancer and serving platform for self-hosting LLMs at scale ☆1,379 · Updated last week
- High-performance Rust stream-processing engine that seamlessly integrates AI capabilities, providing powerful real-time data processing and in… ☆1,218 · Updated this week
- Minimal LLM inference in Rust ☆1,025 · Updated last year
- Your filesystem as a vector database ☆496 · Updated 7 months ago
- Artificial Neural Engine Machine Learning Library ☆1,263 · Updated 2 weeks ago
- A robust message queue system for Rust applications, designed as a Rust alternative to Celery. ☆591 · Updated 4 months ago
- SeekStorm - sub-millisecond full-text search library & multi-tenancy server in Rust ☆1,782 · Updated this week
- Diagram as Code Tool Written in Rust with Draggable Editing ☆2,048 · Updated last week
- Run larger LLMs with longer contexts on Apple Silicon by using differentiated precision for KV cache quantization. KVSplit enables 8-bit … ☆361 · Updated 6 months ago
- High-Performance Implementation of OpenAI's TikToken. ☆462 · Updated 5 months ago
- Fast, streaming indexing, query, and agentic LLM applications in Rust ☆618 · Updated last week
- A powerful document AI question-answering tool that connects to your local Ollama models. Create, manage, and interact with RAG systems f… ☆1,086 · Updated 3 months ago
- Local-first semantic and hybrid BM25 grep / search tool for use by AI and humans! ☆1,026 · Updated 2 weeks ago
- Rust-based package manager for macOS ☆1,857 · Updated 3 months ago
- Modern, fast document parser written in Rust ☆527 · Updated last month
- Fully open-source command-line AI assistant inspired by OpenAI Codex, supporting local language models. ☆652 · Updated 4 months ago
- Demo project showing a single Rust codebase running on CPU and directly on GPUs ☆461 · Updated 3 months ago
- HTTP(s) request filter for processes ☆799 · Updated 3 weeks ago
- A fast, secure MCP server that extends its capabilities through WebAssembly plugins. ☆827 · Updated this week
- MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. I… ☆603 · Updated last month
- Onit macOS client ☆1,007 · Updated 3 months ago
- HelixDB is an open-source graph-vector database built from scratch in Rust. ☆3,432 · Updated this week
- A fast, minimalist directory tree viewer, written in Rust. ☆1,380 · Updated 2 months ago
- Unofficial Rust bindings to Apple's mlx framework ☆211 · Updated this week
- A Positively Charged JavaScript Runtime ☆329 · Updated last month
- A human-friendly alternative to netstat for socket and port monitoring on Linux and macOS. ☆2,501 · Updated last week
- A transformer-based LLM, written entirely in Rust ☆2,977 · Updated last month
- An open-source framework for verifiably private AI inference ☆884 · Updated this week
- High-performance IDE for Jupyter Notebooks ☆2,270 · Updated 2 months ago
- High-performance MLX-based LLM inference engine for macOS with native Swift implementation ☆436 · Updated 2 months ago