johnmai-dev / ChatMLX
ChatMLX is a modern, open-source, high-performance chat application for macOS based on large language models.
★ 757 · Updated last week
Alternatives and similar repositories for ChatMLX:
Users interested in ChatMLX are comparing it to the repositories listed below.
- MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. … — ★ 278 · Updated this week
- Making the community's best AI chat models available to everyone. — ★ 1,937 · Updated last month
- A native macOS app for chatting with local LLMs — ★ 363 · Updated 5 months ago
- Apple MLX engine for LM Studio — ★ 466 · Updated this week
- Generate accurate transcripts using Apple's MLX framework — ★ 382 · Updated 3 months ago
- The easiest way to run the fastest MLX-based LLMs locally — ★ 262 · Updated 4 months ago
- A multi-platform SwiftUI frontend for running local LLMs with Apple's MLX framework. — ★ 398 · Updated 4 months ago
- Examples using MLX Swift — ★ 1,634 · Updated this week
- Swift API for MLX — ★ 1,032 · Updated last week
- All-in-one native macOS AI chat application: Deepseek, ChatGPT, Claude, xAI Grok, Google Gemini, Perplexity, OpenRouter, and all Open AI-… — ★ 406 · Updated 2 weeks ago
- On-device Diffusion Models for Apple Silicon — ★ 599 · Updated 3 months ago
- Use Ollama to talk to local LLMs in Apple Notes — ★ 658 · Updated 5 months ago
- FastMLX is a high-performance, production-ready API to host MLX models. — ★ 272 · Updated 2 weeks ago
- ★ 147 · Updated this week
- MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX. — ★ 1,027 · Updated this week
- An MLX port of FLUX based on the Hugging Face Diffusers implementation. — ★ 1,277 · Updated this week
- Witsy: desktop AI assistant — ★ 759 · Updated this week
- NotebookMLX - An Open Source version of NotebookLM (Ported NotebookLlama) — ★ 263 · Updated 2 weeks ago
- An extremely fast implementation of whisper optimized for Apple Silicon using MLX. — ★ 675 · Updated 10 months ago
- An experiment in meeting transcription and diarization with just an LLM. Maybe I went a little overboard though — ★ 528 · Updated last month
- Ollama client for Swift — ★ 281 · Updated last week
- Chat with MLX is a high-performance macOS application that connects your local documents to a personalized large language model (LLM). — ★ 171 · Updated last year
- Chat with private and local large language models — ★ 1,890 · Updated last month
- Your Local Artificial Memory on your Device. — ★ 471 · Updated 2 months ago