PlugOvr-ai / PlugOvr
AI Assistant
☆20 · Updated 7 months ago
Alternatives and similar repositories for PlugOvr
Users interested in PlugOvr are comparing it to the repositories listed below.
- Yet another `llama.cpp` Rust wrapper ☆12 · Updated last year
- Super-simple, fully Rust-powered "memory" (doc store + semantic search) for LLM projects ☆62 · Updated 2 years ago
- An educational Rust project for exporting and running inference on the Qwen3 LLM family ☆35 · Updated 4 months ago
- Light WebUI for lm.rs ☆24 · Updated last year
- ☆22 · Updated 10 months ago
- A simple, CUDA- or CPU-powered library for creating vector embeddings using Candle and models from Hugging Face ☆46 · Updated last year
- git-like RAG pipeline ☆250 · Updated 2 weeks ago
- VT Code - Semantic coding agent in the terminal ☆283 · Updated this week
- *NIX SHELL with Local AI/LLM integration ☆24 · Updated 9 months ago
- George is an API leveraging AI to make it easy to control a computer with natural language ☆48 · Updated 11 months ago
- A fully autonomous agent that accesses the browser and performs tasks ☆17 · Updated 7 months ago
- The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes ☆239 · Updated 4 months ago
- Library for doing RAG ☆78 · Updated last week
- Kheish: A multi-role LLM agent for tasks like code auditing, file searching, and more, seamlessly leveraging RAG and extensible modules ☆141 · Updated 11 months ago
- Built for demanding AI workflows, this gateway offers low-latency, provider-agnostic access, ensuring your AI applications run smoothly a… ☆84 · Updated 6 months ago
- ☆17 · Updated 3 months ago
- An OpenVoice-based voice cloning tool, single executable file (~14M), supporting multiple formats without dependencies on ffmpeg, Python,… ☆38 · Updated 3 months ago
- Native OCR for macOS, Windows, Linux ☆195 · Updated last month
- Use multiple LLM backends in a single crate, simple builder-based configuration, and built-in prompt chaining & templating ☆138 · Updated 6 months ago
- Friendly interface to chat with an Ollama instance