geerlingguy / beowulf-ai-cluster
AI cluster deployed with Ansible on random computers with random capabilities
☆188 · Updated 2 weeks ago
Alternatives and similar repositories for beowulf-ai-cluster
Users interested in beowulf-ai-cluster are comparing it to the repositories listed below
- Linux distro for AI computers. Go from bare-metal GPUs to running AI workloads - like vLLM, SGLang, RAG, and Agents - in minutes, fully a… ☆244 · Updated this week
- Simple ollama benchmarking tool. ☆131 · Updated 3 weeks ago
- Curated list of tools, frameworks, and resources for running, building, and deploying AI privately — on-prem, air-gapped, or self-hosted. ☆136 · Updated this week
- Lemonade helps users run local LLMs with the highest performance by configuring state-of-the-art inference engines for their NPUs and GPU… ☆1,178 · Updated this week
- A no-install needed web-GUI for Ollama. ☆425 · Updated this week
- Let LLMs control embedded devices via the Model Context Protocol. ☆144 · Updated 2 months ago
- ☆66 · Updated this week
- This is a cross-platform desktop application that allows you to chat with locally hosted LLMs and enjoy features like MCP support ☆222 · Updated 3 weeks ago
- ☆166 · Updated last week
- 100% Local Memory layer and Knowledge base for agents with WebUI ☆459 · Updated last month
- Run & debug workflows for AI agents running Dockerized tools in VSCode ☆96 · Updated last month
- A multi-agent AI architecture that connects 25+ specialized agents through n8n and MCP servers. Project NOVA routes requests to domain-sp… ☆214 · Updated 2 months ago
- ☆144 · Updated 3 weeks ago
- Model Context Protocol server for Cyclops ☆29 · Updated 2 months ago
- Abbey is a self-hosted configurable AI interface with workspaces, document chats, YouTube chats, and more. Find our hosted version at htt… ☆400 · Updated 4 months ago
- ScribePal is an Open Source intelligent browser extension that leverages AI to empower your web experience by providing contextual insigh… ☆18 · Updated last week
- Work with LLMs on a local environment using containers ☆251 · Updated this week
- LLM Benchmark for Throughput via Ollama (Local LLMs) ☆286 · Updated 3 weeks ago
- ☆122 · Updated 3 weeks ago
- User-friendly AI Interface (Supports Ollama, OpenAI API, ...) ☆103 · Updated 5 months ago
- beep boop 🤖 (experimental) ☆113 · Updated 7 months ago
- System Power Monitoring using Smart Plugs from the Terminal ☆203 · Updated 4 months ago
- One-click ML infrastructure for teams that just want to get sh*t done. ☆123 · Updated 2 months ago
- Local AI voice assistant stack for Home Assistant (GPU-accelerated) with persistent memory, follow-up conversation, and Ollama model reco… ☆193 · Updated last month
- 🤖🕰️ An MCP server that gives language models temporal awareness and time calculation abilities. Teaching AI the significance of the pas… ☆699 · Updated 2 months ago
- A local, privacy-first résumé builder using LLMs and Markdown to generate ATS-ready DOCX files with Pandoc — no cloud, no tracking. ☆332 · Updated last week
- Self-contained GitOps environment for deterministic, recursively bootstrapped container automation on Proxmox VE. ☆104 · Updated this week
- OpenAPI Tool Servers ☆629 · Updated 2 months ago
- API up your Ollama Server. ☆173 · Updated 2 months ago
- Generate and execute command line commands using LLM ☆49 · Updated 6 months ago