Use locally running LLMs directly from Siri 🦙📱
★186 · Oct 1, 2024 · Updated last year
Alternatives and similar repositories for SiriLLama
Users interested in SiriLLama are comparing it to the repositories listed below.
- AI-powered search tool offering content-based text and visual similarity search system-wide · ★278 · May 25, 2025 · Updated 10 months ago
- ★45 · May 5, 2024 · Updated last year
- After my server UI improvements were successfully merged, consider this repo a playground for experimenting, tinkering, and hacking around… · ★53 · Aug 18, 2024 · Updated last year
- Local image and music generation for Apple Silicon · ★78 · Mar 22, 2025 · Updated last year
- A stable, fast, and easy-to-use inference library with a focus on a sync-to-async API · ★48 · Sep 26, 2024 · Updated last year
- An autonomous AI agent extension for Oobabooga's web UI · ★172 · Sep 7, 2023 · Updated 2 years ago
- A guidance compatibility layer for llama-cpp-python · ★36 · Sep 11, 2023 · Updated 2 years ago
- Spotlight-like client for Ollama on Windows · ★28 · May 18, 2024 · Updated last year
- Run Ollama and GGUF models easily with a single command · ★52 · May 15, 2024 · Updated last year
- Use mark to run lots of prompts on lots of data · ★18 · Aug 12, 2025 · Updated 8 months ago
- A simple experiment in letting two local LLMs have a conversation about anything · ★112 · Jul 3, 2024 · Updated last year
- An auto-sleeping and -waking framework around llama.cpp · ★12 · Feb 8, 2025 · Updated last year
- ★43 · May 20, 2024 · Updated last year
- AlwaysReddy is an LLM voice assistant that is always just a hotkey away · ★762 · Mar 4, 2025 · Updated last year
- OllamaLab: a fully fledged AI assistant built on Ollama, with Companion Mode for macOS · ★34 · Sep 11, 2024 · Updated last year
- Convert files, folders, and GitHub repos into AI/LLM-ready files · ★163 · Jan 31, 2025 · Updated last year
- An LLM-agnostic desktop and mobile client · ★316 · Sep 13, 2025 · Updated 7 months ago
- Clipboard Conqueror is a novel copy-and-paste copilot alternative designed to bring your very own LLM AI assistant to any text field · ★442 · Jan 11, 2025 · Updated last year
- FastMLX is a high-performance, production-ready API for hosting MLX models · ★352 · Mar 18, 2025 · Updated last year
- Train LLMs by just modifying config files · ★24 · Nov 23, 2023 · Updated 2 years ago
- Software to implement GoT with a Weaviate vector database · ★684 · Mar 25, 2025 · Updated last year
- Experimental LLM inference UX to aid in creative writing · ★128 · Dec 14, 2024 · Updated last year
- A Qt GUI for large language models · ★40 · Dec 27, 2023 · Updated 2 years ago
- Modified beam search with periodic restarts · ★12 · Sep 12, 2024 · Updated last year
- CrewAI template for an Autonomeee agent · ★18 · Oct 1, 2024 · Updated last year
- "a towel is about the most massively useful thing an interstellar AI hitchhiker can have" · ★48 · Oct 9, 2024 · Updated last year
- ★19 · Sep 4, 2024 · Updated last year
- Simple LLM inference server · ★20 · Jun 13, 2024 · Updated last year
- SiLLM simplifies training and running large language models (LLMs) on Apple Silicon by leveraging the MLX framework · ★284 · Jun 16, 2025 · Updated 10 months ago
- ★14 · Jun 6, 2024 · Updated last year
- Locally running LLM with internet access · ★96 · Jun 30, 2025 · Updated 9 months ago
- Export Apple Messages data to JSON · ★22 · Mar 7, 2024 · Updated 2 years ago
- Auto-video maker orchestrating many AIs · ★11 · Mar 18, 2024 · Updated 2 years ago
- Gateway and load balancer for your LLM inference endpoints · ★26 · Nov 1, 2024 · Updated last year
- MLX Transformers is a library that provides model implementations in MLX. It uses a similar model interface as HuggingFace Transformers an… · ★76 · Mar 23, 2026 · Updated 3 weeks ago
- A plugin for llm to support structured outputs · ★12 · Feb 1, 2025 · Updated last year
- A pytest plugin to organize and track algorithm visualizations · ★18 · Dec 1, 2024 · Updated last year
- Accompanying code and the SEP dataset for the paper "Can LLMs Separate Instructions From Data? And What Do We Even Mean By That?" · ★61 · Mar 11, 2025 · Updated last year
- A powerful MCP memory using a knowledge graph powered by Elasticsearch · ★19 · Apr 1, 2026 · Updated 2 weeks ago