Mobile-Artificial-Intelligence / maid
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
☆2,296 · Updated 6 months ago
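Maid reaches remote models over HTTP: Ollama and OpenAI both expose an OpenAI-compatible chat-completions endpoint. As a minimal sketch (in Python, not Maid's actual Flutter/Dart code) of the kind of request body such a client sends — the endpoint URL, function name, and model name below are illustrative assumptions, not taken from Maid itself:

```python
import json

# Hypothetical endpoint: Ollama's default local server, OpenAI-compatible route.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str, stream: bool = True) -> str:
    """Serialize an OpenAI-style chat-completions body as JSON."""
    body = {
        "model": model,  # e.g. the name of a GGUF model served locally
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # stream tokens back as they are generated
    }
    return json.dumps(body)

# A client would POST this payload to OLLAMA_URL (or api.openai.com).
payload = build_chat_request("llama3", "Hello!")
print(payload)
```

The same payload shape works against either backend, which is why apps in this list can switch between local and hosted models by changing only the base URL and API key.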
Alternatives and similar repositories for maid
Users interested in maid are comparing it to the repositories listed below.
- A modern and easy-to-use client for Ollama ☆1,616 · Updated 3 months ago
- Simple frontend for LLMs built in React Native ☆2,042 · Updated last week
- Run any GGUF SLMs/LLMs locally, on-device in Android ☆655 · Updated 3 weeks ago
- A mobile implementation of llama.cpp ☆325 · Updated last year
- IRIS is an Android app for interfacing with GGUF / llama.cpp models locally ☆264 · Updated 11 months ago
- An app that brings language models directly to your phone ☆5,574 · Updated this week
- Stable Diffusion AI client app for Android ☆1,120 · Updated this week
- Chat with your documents using local AI ☆1,078 · Updated last year
- Local AI API platform ☆2,761 · Updated 6 months ago
- An awesome repository of local AI tools ☆1,819 · Updated last year
- ☆287 · Updated 2 months ago
- Amica is an open-source interface for interactive communication with 3D characters, with voice synthesis and speech recognition ☆1,178 · Updated 6 months ago
- Run LLaMA and other large language models offline on iOS and macOS using the GGML library ☆1,965 · Updated last month
- 🔍 AI search engine — self-host with local or cloud LLMs ☆3,502 · Updated last year
- Language Model Playground ☆66 · Updated last week
- Replace Copilot with local AI ☆2,082 · Updated last year
- An Ollama client for Android! ☆89 · Updated last year
- An OpenAI API-compatible text-to-speech server using Coqui AI's xtts_v2 and/or Piper TTS as the backend ☆847 · Updated 11 months ago
- Compare open-source local LLM inference projects by their metrics to assess popularity and activeness ☆705 · Updated 2 months ago
- The terminal client for Ollama ☆2,305 · Updated last month
- A modern and easy-to-use client for Ollama ☆149 · Updated last year
- AlwaysReddy is an LLM voice assistant that is always just a hotkey away ☆761 · Updated 10 months ago
- Lightweight, standalone, multi-platform, and privacy-focused local LLM chat interface with optional encryption ☆154 · Updated 9 months ago
- Fast Stable Diffusion on CPU and AI PC ☆1,973 · Updated 2 weeks ago
- The official API server for Exllama. OpenAI-compatible, lightweight, and fast ☆1,119 · Updated last week
- Pure C++ implementation of several models for real-time chatting on your computer (CPU & GPU) ☆777 · Updated last week
- Kernels & AI inference engine for mobile devices ☆4,131 · Updated last week
- The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with LLM … ☆612 · Updated 11 months ago
- Reliable model swapping for any local OpenAI/Anthropic-compatible server (llama.cpp, vLLM, etc.) ☆2,260 · Updated last week
- A minimal LLM chat app that runs entirely in your browser ☆1,064 · Updated 3 months ago