IanGupta / Kick
Kick is an AI-powered assistant that provides voice and keyboard control over your Windows device, enabling seamless automation of your daily tasks.
☆16 · Updated 4 months ago
Alternatives and similar repositories for Kick
Users interested in Kick are comparing it to the libraries listed below.
- ☆74 · Updated 7 months ago
- Extract data from websites in LLM-ready JSON or CSV format. Crawl or scrape entire websites with Website Crawler. ☆71 · Updated 2 months ago
- A lightweight UI for chatting with Ollama models. Streaming responses, conversation history, and multi-model support. ☆145 · Updated 9 months ago
- Belullama is a comprehensive AI application that bundles Ollama, Open WebUI, and Automatic1111 (Stable Diffusion WebUI) into a single, ea… ☆192 · Updated 6 months ago
- Explore, Install, Innovate — in 1 Click. ☆128 · Updated this week
- GenFilesMCP: Minimal MCP Server for Open Web UI. Generates PPT, Excel, Word, or Markdown files using user requests and full chat context. ☆44 · Updated 2 weeks ago
- Agent MCP for ffmpeg. ☆211 · Updated 6 months ago
- reddacted lets you analyze & sanitize your online footprint using LLMs, PII detection & sentiment analysis to identify anything that migh… ☆113 · Updated 5 months ago
- Chat with your PDF using your local LLM via the Ollama client. ☆41 · Updated last year
- LLM Collaboration. ☆12 · Updated last year
- pdfLLM is a completely open-source, proof-of-concept RAG app. ☆180 · Updated 3 months ago
- The rent a hal project for AI. ☆21 · Updated 4 months ago
- Integrates AI tools into Microsoft Word. ☆156 · Updated last year
- AI-powered chatbot with real-time updates. ☆68 · Updated last year
- Web extension to use local AI models (Ollama). ☆63 · Updated 10 months ago
- A dungeon AI that runs locally using your LLM. ☆69 · Updated last month
- ☆99 · Updated 2 months ago
- PDFstract: a conversion and OCR benchmarking solution, soon to be a unified pipeline for PDF data extraction. CLI and GUI. ☆56 · Updated 3 weeks ago
- Powerful search page powered by LLMs and SearXNG. ☆265 · Updated last month
- The easiest and fastest way to run LLMs in your home lab.