psugihara / FreeChat
llama.cpp-based AI chat app for macOS
☆497 · Updated 9 months ago
Alternatives and similar repositories for FreeChat
Users that are interested in FreeChat are comparing it to the libraries listed below
- An LLM-agnostic desktop and mobile client. ☆292 · Updated 3 months ago
- User interface made for Ollama.ai using Swift ☆352 · Updated last month
- A multi-platform SwiftUI frontend for running local LLMs with Apple's MLX framework. ☆419 · Updated 10 months ago
- The easiest way to run the fastest MLX-based LLMs locally ☆297 · Updated 10 months ago
- SiLLM simplifies the process of training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework. ☆278 · Updated 2 months ago
- Mac-compatible Ollama Voice ☆496 · Updated last week
- Swift library to work with llama and other large language models. ☆268 · Updated 3 weeks ago
- Mac app to demonstrate swift-transformers ☆570 · Updated last year
- Large Language Model (LLM) applications and tools running on Apple Silicon in real time with Apple MLX. ☆454 · Updated 7 months ago
- Use Ollama to talk to local LLMs in Apple Notes ☆693 · Updated 3 weeks ago
- An extremely fast implementation of whisper optimized for Apple Silicon using MLX. ☆770 · Updated last year
- FastMLX is a high-performance, production-ready API for hosting MLX models. ☆325 · Updated 5 months ago
- llama and other large language models on iOS and macOS, offline, using the GGML library. ☆1,855 · Updated 3 weeks ago
- Ollama client for Swift ☆312 · Updated 5 months ago
- A simple web UI / frontend for MLX mlx-lm using Streamlit. ☆261 · Updated 2 months ago
- All-in-one native macOS AI chat application: Deepseek, ChatGPT, Claude, xAI Grok, Google Gemini, Perplexity, OpenRouter, and all Open AI-… ☆667 · Updated last week
- ☆186 · Updated 5 months ago
- Mac app for Ollama ☆1,874 · Updated 5 months ago
- Large Language Model (LLM) module for the Spezi Ecosystem ☆262 · Updated this week
- MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. I… ☆537 · Updated last week
- Stable Diffusion implementation using CoreML, PyTorch and SwiftUI ☆169 · Updated last year
- Chat with MLX is a high-performance macOS application that connects your local documents to a personalized large language model (LLM). ☆177 · Updated last year
- Local ML voice chat using high-end models. ☆175 · Updated last week
- Fork of llama.cpp, supporting Facebook's LLaMA model in Swift ☆183 · Updated 2 years ago
- Phi-3.5 for Mac: Locally-run Vision and Language Models for Apple Silicon ☆273 · Updated 11 months ago
- All-in-one desktop app for running LLMs locally. ☆457 · Updated 3 weeks ago
- LM Studio Apple MLX engine ☆756 · Updated this week
- A simple example of using MLX for a RAG application running locally on your Apple Silicon device. ☆174 · Updated last year
- From anywhere you can type, query and stream the output of any script (e.g. an LLM) ☆495 · Updated last year
- A wannabe Ollama equivalent for Apple MLX models ☆79 · Updated 6 months ago