apeatling / ollama-voice-mac
Mac-compatible Ollama Voice
☆401 · Updated 5 months ago
Related projects:
- SiLLM simplifies the process of training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework. ☆207 · Updated last week
- Plug Whisper audio transcription into a local Ollama server and output TTS audio responses. ☆196 · Updated 10 months ago
- Examples of using E2B. ☆652 · Updated this week
- From anywhere you can type, query and stream the output of an LLM or any other script. ☆442 · Updated 5 months ago
- A multi-platform desktop application to evaluate and compare LLM models, written in Rust and React. ☆430 · Updated this week
- Local AI talk with a custom voice based on the Zephyr 7B model. Uses RealtimeSTT with faster_whisper for transcription and RealtimeTTS with C… ☆475 · Updated last month
- Use locally running LLMs directly from Siri 🦙🟣 ☆145 · Updated 3 weeks ago
- The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with LLM … ☆467 · Updated last month
- ScribeWizard: Generate organized notes from audio using Groq, Whisper, and Llama3. ☆412 · Updated 3 weeks ago
- An extremely fast implementation of Whisper optimized for Apple Silicon using MLX. ☆519 · Updated 4 months ago
- Local voice chatbot for engaging conversations, powered by Ollama, Hugging Face Transformers, and the Coqui TTS Toolkit. ☆677 · Updated last month
- Go manage your Ollama models. ☆371 · Updated this week
- Python & JS/TS SDK for running AI-generated code / code interpreting in your AI app. ☆1,097 · Updated this week
- ☆278 · Updated 3 months ago
- FastMLX is a high-performance, production-ready API for hosting MLX models. ☆163 · Updated last week
- Local semantic search. Stupidly simple. ☆378 · Updated 2 months ago
- AlwaysReddy is an LLM voice assistant that is always just a hotkey away. ☆606 · Updated last week
- An application for running LLMs locally on your device, with your documents, facilitating detailed citations in generated responses. ☆459 · Updated this week
- Minimalistic UI for Ollama LMs - a React interface that drastically improves the chatbot experience and works offline. ☆245 · Updated 3 months ago
- Building AI agents, atomically. ☆344 · Updated this week
- Phi-3.5 for Mac: locally-run vision and language models for Apple Silicon. ☆206 · Updated last week
- Plugin that lets you use LM Studio to ask questions about your documents, including audio and video files. ☆267 · Updated this week
- Your Trusty Memory-enabled AI Companion - a simple RAG chatbot optimized for local LLMs | 12 languages supported | OpenAI API compatible. ☆229 · Updated last month
- LLMX; the easiest third-party local LLM UI for the web! ☆144 · Updated last month
- A simple UI / web frontend for MLX mlx-lm using Streamlit. ☆219 · Updated 2 months ago
- ☆423 · Updated last week
- Chat with your documents using local AI. ☆885 · Updated 2 months ago
- On-device inference of diffusion models for Apple Silicon. ☆434 · Updated last week
- Link your Ollama models to LM Studio. ☆107 · Updated 2 months ago
- The easiest way to run the fastest MLX-based LLMs locally. ☆202 · Updated 2 months ago