RoyalCities / RC-Home-Assistant-Low-VRAM
Local AI voice assistant stack for Home Assistant (GPU-accelerated) with persistent memory, follow-up conversation, and Ollama model recommendations; settings designed for low-VRAM systems.
☆191 · Updated 3 weeks ago
Alternatives and similar repositories for RC-Home-Assistant-Low-VRAM
Users interested in RC-Home-Assistant-Low-VRAM are comparing it to the repositories listed below.
- ☆186 · Updated 4 months ago
- ☆162 · Updated 2 weeks ago
- ☆168 · Updated last week
- A lightweight recreation of OS1/Samantha from the movie Her, running locally in the browser ☆108 · Updated last month
- Notate is a desktop chat application that takes AI conversations to the next level. It combines the simplicity of chat with advanced feat… ☆256 · Updated 6 months ago
- llmbasedos — Local-First OS Where Your AI Agents Wake Up and Work ☆271 · Updated this week
- OLLama IMage CAtegorizer ☆68 · Updated 7 months ago
- Local LLM Powered Recursive Search & Smart Knowledge Explorer ☆251 · Updated 6 months ago
- You don’t need to read the code to understand how to build! ☆205 · Updated 7 months ago
- The PyVisionAI Official Repo ☆105 · Updated last month
- 🗣️ Real‑time, low‑latency voice, vision, and conversational‑memory AI assistant built on LiveKit and local LLMs ✨ ☆88 · Updated 2 months ago
- ☆133 · Updated 2 months ago
- Use smolagents to do research and then update CSV columns with its findings. ☆41 · Updated 6 months ago
- A web application that converts speech to speech, 100% private ☆75 · Updated 2 months ago
- A simple-to-use Python library for creating podcasts, with support for many LLM and TTS providers ☆44 · Updated 3 weeks ago
- An LLM search engine faster than Perplexity! ☆349 · Updated this week
- Welcome! ☆140 · Updated 8 months ago
- Command-line personal assistant using your favorite proprietary or local models, with access to 30+ tools ☆111 · Updated last month
- ☆221 · Updated 3 months ago
- AI creative coding studio: deep research, blogs, and animation, all in the browser with full privacy. ☆62 · Updated this week
- A sleek web interface for Ollama, making local LLM management and usage simple. WebOllama provides an intuitive UI to manage Ollama model… ☆53 · Updated 3 months ago
- A multi-agent AI architecture that connects 25+ specialized agents through n8n and MCP servers. Project NOVA routes requests to domain-sp… ☆210 · Updated 2 months ago
- ☆82 · Updated 5 months ago
- Lightweight & fast AI inference proxy for self-hosted LLM backends like Ollama, LM Studio, and others. Designed for speed, simplicity and… ☆77 · Updated this week
- A cross-platform desktop application that lets you chat with locally hosted LLMs and enjoy features like MCP support ☆222 · Updated 2 weeks ago
- A persistent local memory for AI, LLMs, or Copilot in VS Code. ☆128 · Updated last week
- Agent MCP for ffmpeg ☆202 · Updated 2 months ago
- Open-source LLM UI, compatible with all local LLM providers. ☆174 · Updated 11 months ago
- A command-line personal assistant that integrates with Google Calendar, Gmail, and Tasks to help manage your digital life. ☆125 · Updated 9 months ago
- Generates breakthrough ideas from a single prompt through an 8-stage walkthrough, with an optional research proposal paper. ☆56 · Updated 5 months ago