trymirai / uzu
A high-performance inference engine for AI models
☆1,390 · Updated this week
Alternatives and similar repositories for uzu
Users interested in uzu are comparing it to the libraries listed below.
- Minimal LLM inference in Rust ☆1,025 · Updated last year
- Open-source LLM load balancer and serving platform for self-hosting LLMs at scale ☆1,402 · Updated last week
- High-performance Rust stream processing engine that seamlessly integrates AI capabilities, providing powerful real-time data processing and in… ☆1,236 · Updated this week
- High-performance implementation of OpenAI's TikToken. ☆465 · Updated 5 months ago
- Local-first semantic and hybrid BM25 grep/search tool for use by AI and humans! ☆1,084 · Updated last month
- Artificial Neural Engine Machine Learning Library ☆1,279 · Updated 3 weeks ago
- Your filesystem as a vector database ☆496 · Updated 7 months ago
- Run larger LLMs with longer contexts on Apple Silicon by using differentiated precision for KV cache quantization. KVSplit enables 8-bit… ☆362 · Updated 7 months ago
- Onit macOS client ☆1,014 · Updated 4 months ago
- MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. I… ☆625 · Updated this week
- A powerful document AI question-answering tool that connects to your local Ollama models. Create, manage, and interact with RAG systems f… ☆1,092 · Updated 4 months ago
- Diagram-as-code tool written in Rust with draggable editing ☆2,127 · Updated last week
- A robust message queue system for Rust applications, designed as a Rust alternative to Celery. ☆593 · Updated 5 months ago
- An open-source framework for verifiably private AI inference ☆895 · Updated last week
- Unofficial Rust bindings to Apple's mlx framework ☆219 · Updated last week
- A fast, minimalist directory tree viewer, written in Rust. ☆1,392 · Updated 3 months ago
- Rust-based package manager for macOS ☆1,852 · Updated 3 months ago
- Demo project showing a single Rust codebase running on CPU and directly on GPUs ☆463 · Updated 4 months ago
- Fully open-source command-line AI assistant inspired by OpenAI Codex, supporting local language models. ☆661 · Updated 5 months ago
- SQLite-Vector is a cross-platform, ultra-efficient SQLite extension that brings vector search capabilities to your embedded database. ☆463 · Updated last week
- Communicate with an LLM provider using a single interface ☆1,511 · Updated this week
- SeekStorm: sub-millisecond full-text search library & multi-tenancy server in Rust ☆1,790 · Updated 3 weeks ago
- Ultra-lightweight AI agent ☆421 · Updated 4 months ago
- Fast, streaming indexing, query, and agentic LLM applications in Rust ☆628 · Updated last week
- ☆271 · Updated 4 months ago
- Git-based memory storage for conversational AI agents ☆759 · Updated last month
- Embeddable library or single binary for indexing and searching 1B vectors ☆340 · Updated 2 weeks ago
- Persistent memory for LLMs and apps. Content-addressed storage with dedupe, compression, full-text and vector search. ☆358 · Updated this week
- git-like RAG pipeline ☆250 · Updated this week
- Rust framework for LLM orchestration ☆203 · Updated last year