antranapp / awesome-mlx
☆196 · Updated 10 months ago
Alternatives and similar repositories for awesome-mlx
Users who are interested in awesome-mlx are comparing it to the libraries listed below.
- MLX-Embeddings is a package for running vision and language embedding models locally on your Mac using MLX. ☆263 · Updated 2 weeks ago
- FastMLX is a high-performance, production-ready API for hosting MLX models. ☆341 · Updated 10 months ago
- SiLLM simplifies training and running large language models (LLMs) on Apple Silicon by leveraging the MLX framework. ☆284 · Updated 7 months ago
- MLX Model Manager unifies loading and inference for LLMs and VLMs. ☆103 · Updated last year
- A simple example of using MLX for a RAG application running locally on your Apple Silicon device. ☆178 · Updated 2 years ago
- Chat with MLX is a high-performance macOS application that connects your local documents to a personalized large language model (LLM). ☆176 · Updated last year
- Phi-3.5 for Mac: locally-run vision and language models for Apple Silicon ☆273 · Updated 2 months ago
- The easiest way to run the fastest MLX-based LLMs locally ☆308 · Updated last year
- ☆77 · Updated last year
- Large language model (LLM) applications and tools running in real time on Apple Silicon with Apple MLX. ☆458 · Updated last year
- CLI to demonstrate running a large language model (LLM) on the Apple Neural Engine. ☆119 · Updated last year
- Start a server from the MLX library. ☆196 · Updated last year
- Benchmark of Apple MLX operations on all Apple Silicon chips (GPU, CPU) + MPS and CUDA. ☆214 · Updated 3 weeks ago
- mlx image models for Apple Silicon machines ☆91 · Updated 2 months ago
- For running inference on and serving local LLMs using the MLX framework ☆110 · Updated last year
- A simple web UI / frontend for mlx-lm using Streamlit (see the usage sketch after this list). ☆260 · Updated 3 months ago
- Python tools for WhisperKit: model conversion, optimization, and evaluation ☆235 · Updated 2 months ago
- MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. … ☆656 · Updated last month
- Your gateway to both Ollama & Apple MLX models ☆149 · Updated 11 months ago
- 📋 NotebookMLX - an open-source version of NotebookLM (a port of NotebookLlama) ☆335 · Updated 10 months ago
- Train large language models on MLX. ☆245 · Updated this week
- CLI tool for text-to-image generation using the FLUX.1 model. ☆67 · Updated 7 months ago
- Run embeddings in MLX ☆97 · Updated last year
- An LLM-agnostic desktop and mobile client. ☆315 · Updated 4 months ago
- A multi-platform SwiftUI frontend for running local LLMs with Apple's MLX framework. ☆429 · Updated last year
- Swift implementation of Flux.1 using mlx-swift ☆112 · Updated 5 months ago
- Minimal, clean-code implementation of RAG with MLX using GGUF model weights ☆53 · Updated last year
- ☆307 · Updated 9 months ago
- MLX-GUI: an MLX inference server for Apple Silicon ☆176 · Updated 2 weeks ago
- Blazing-fast Whisper Turbo for ASR (speech-to-text) tasks ☆218 · Updated 2 months ago
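
Several of the projects above (the mlx-lm Streamlit frontend, the MLX servers, and the local-LLM runners) build on the mlx-lm package. As a rough orientation, here is a minimal sketch of local text generation with mlx-lm; the model name is illustrative, and any MLX-converted model from the mlx-community organization on Hugging Face should work similarly.

```python
# Minimal local text generation with mlx-lm (a sketch, not an endorsement of any
# repo above). Requires Apple Silicon and: pip install mlx-lm
from mlx_lm import load, generate

# Downloads the MLX-converted weights from Hugging Face on first use.
# The model name below is illustrative.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

prompt = "Explain in one sentence what Apple MLX is."
text = generate(model, tokenizer, prompt=prompt, max_tokens=100)
print(text)
```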