Mobile-Artificial-Intelligence / maid
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
☆2,148 · Updated 3 weeks ago
Alternatives and similar repositories for maid
Users interested in maid are comparing it to the repositories listed below.
- A modern and easy-to-use client for Ollama ☆1,435 · Updated this week
- Simple frontend for LLMs built in React Native ☆1,718 · Updated last week
- IRIS is an Android app for interfacing with GGUF / llama.cpp models locally ☆232 · Updated 6 months ago
- Run any GGUF SLM/LLM locally, on-device, on Android ☆462 · Updated last week
- A mobile implementation of llama.cpp ☆314 · Updated last year
- Run Llama and other large language models offline on iOS and macOS using the GGML library ☆1,852 · Updated 2 weeks ago
- Stable Diffusion AI client app for Android ☆996 · Updated last week
- Amica is an open-source interface for interactive communication with 3D characters, with voice synthesis and speech recognition ☆1,045 · Updated last month
- AubAI brings you on-device gen-AI capabilities, including offline text generation and more, directly within your app ☆299 · Updated last year
- The official API server for Exllama. OpenAI-compatible, lightweight, and fast ☆1,031 · Updated this week
- Cross-platform framework for deploying LLM/VLM/TTS models locally on smartphones ☆2,870 · Updated this week
- Distributed LLM inference. Connect home devices into a powerful cluster to accelerate LLM inference. More devices means faster inference ☆2,283 · Updated last week
- Dart binding for llama.cpp ☆246 · Updated last week
- Model swapping for llama.cpp (or any local OpenAI-compatible server) ☆1,333 · Updated this week
- Local AI API Platform ☆2,762 · Updated last month
- An app that brings language models directly to your phone ☆4,624 · Updated this week
- The terminal client for Ollama ☆2,122 · Updated last week
- A local AI replacement for Copilot ☆2,040 · Updated last year
- llama.cpp fork with additional SOTA quants and improved performance ☆1,050 · Updated last week
- LLM frontend in a single HTML file ☆629 · Updated 7 months ago
- Simple HTML UI for Ollama ☆1,075 · Updated 6 months ago
- LocalAGI is a powerful, self-hostable AI Agent platform designed for maximum privacy and flexibility. A complete drop-in replacement for … ☆1,087 · Updated this week
- Pure C++ implementation of several models for real-time chatting on your computer (CPU & GPU) ☆677 · Updated this week
- lcpp is a Dart implementation of llama.cpp used by the Mobile Artificial Intelligence Distribution (maid) ☆99 · Updated this week
- An Ollama client for Android! ☆86 · Updated last year
- ☆252 · Updated last month
- A self-hosted web UI for 30+ generative AI models ☆607 · Updated this week
- Web UI for ExLlamaV2 ☆506 · Updated 6 months ago
- LM Studio CLI ☆3,553 · Updated last week
- A proxy server for multiple Ollama instances with key-based security ☆480 · Updated 3 weeks ago