anurmatov / mac-studio-server
Optimized Ollama LLM server configuration for Mac Studio and other Apple Silicon Macs. Headless setup with automatic startup, resource optimization, and remote management via SSH.
☆265 · Updated 7 months ago
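The repository's actual configuration files are not reproduced on this page. As a hedged sketch only: a headless, auto-starting Ollama server on macOS is typically set up with a launchd property list, roughly along these lines. The label, binary path, and chosen environment values below are illustrative assumptions, not taken from the repo; `OLLAMA_HOST` is Ollama's standard variable for its listen address.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<!-- Hypothetical /Library/LaunchDaemons/com.example.ollama.plist -->
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.ollama</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/ollama</string>
        <string>serve</string>
    </array>
    <key>EnvironmentVariables</key>
    <dict>
        <!-- bind all interfaces so the server is reachable over SSH tunnels / LAN -->
        <key>OLLAMA_HOST</key>
        <string>0.0.0.0:11434</string>
    </dict>
    <!-- start at boot and restart if the process exits -->
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```

Such a daemon would be loaded with `sudo launchctl load /Library/LaunchDaemons/com.example.ollama.plist`; the actual repository may differ in label, paths, and resource-optimization settings.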
Alternatives and similar repositories for mac-studio-server
Users interested in mac-studio-server are comparing it to the repositories listed below.
- MLX-GUI MLX Inference Server for Apple Silicon ☆127 · Updated 2 months ago
- This is a cross-platform desktop application that allows you to chat with locally hosted LLMs and enjoy features like MCP support ☆224 · Updated 2 months ago
- Notate is a desktop chat application that takes AI conversations to the next level. It combines the simplicity of chat with advanced feat… ☆256 · Updated 8 months ago
- High-performance MLX-based LLM inference engine for macOS with native Swift implementation ☆419 · Updated last month
- llmbasedos — Local-First OS Where Your AI Agents Wake Up and Work ☆276 · Updated 2 months ago
- MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. I… ☆587 · Updated last week
- A beautiful local-first coding agent running in your terminal - built by the community for the community ⚒ ☆749 · Updated this week
- ollama-like CLI tool for MLX models on Hugging Face (pull, rm, list, show, serve, etc.) ☆108 · Updated last week
- Accessing Apple Intelligence and ChatGPT desktop through OpenAI / Ollama API ☆297 · Updated 2 months ago
- Run and monitor MCP servers locally ☆196 · Updated 2 months ago
- A lightweight UI for chatting with Ollama models. Streaming responses, conversation history, and multi-model support. ☆125 · Updated 7 months ago
- Qwen Image models through MPS ☆215 · Updated last month
- Your gateway to both Ollama & Apple MLX models ☆146 · Updated 7 months ago
- An MCP server allowing LLM agents to easily connect and retrieve data from any database ☆98 · Updated 2 months ago
- Welcome! ☆140 · Updated 10 months ago
- Parse files (e.g. code repos) and websites to clipboard or a file for ingestion by AI / LLMs ☆307 · Updated 2 months ago
- Local debugging agent that runs in your terminal ☆392 · Updated 4 months ago
- Fast local speech-to-text for any app using faster-whisper ☆141 · Updated last month
- A multi-agent AI architecture that connects 25+ specialized agents through n8n and MCP servers. Project NOVA routes requests to domain-sp… ☆226 · Updated 4 months ago
- The easiest way to run the fastest MLX-based LLMs locally ☆304 · Updated 11 months ago
- Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control. ☆314 · Updated last week
- Local AI voice assistant stack for Home Assistant (GPU-accelerated) with persistent memory, follow-up conversation, and Ollama model reco… ☆212 · Updated 3 months ago
- LLM search engine faster than Perplexity! ☆365 · Updated 2 months ago
- pdfLLM is a completely open-source, proof-of-concept RAG app. ☆163 · Updated last month
- All-in-one MCP server that can connect your AI agents to any native endpoint, powered by UTCP ☆143 · Updated last month
- CLI deep research tool using Ollama (AI agent to research complex queries online) ☆70 · Updated last week
- MCP Playbooks for AI agents ☆351 · Updated this week
- A command-line utility to manage MLX models between your Hugging Face cache and LM Studio. ☆64 · Updated 8 months ago
- ☆94 · Updated last week
- Claude Code with any LLM ☆213 · Updated 2 months ago