vegaluisjose / mlx-rag
Explore a simple example of using MLX for a RAG application running locally on your Apple Silicon device.
☆178 · Jan 31, 2024 · Updated 2 years ago
Alternatives and similar repositories for mlx-rag
Users interested in mlx-rag are comparing it to the libraries listed below.
- Run embeddings in MLX ☆97 · Sep 27, 2024 · Updated last year
- Minimal, clean code implementation of RAG with MLX using GGUF model weights ☆53 · Apr 27, 2024 · Updated last year
- MLX-Embeddings is the best package for running Vision and Language Embedding models locally on your Mac using MLX ☆273 · Updated this week
- Examples for using the SiLLM framework for training and running Large Language Models (LLMs) on Apple Silicon ☆16 · May 8, 2025 · Updated 9 months ago
- MLX implementation of GCN, with benchmarks on MPS, CUDA, and CPU (M1 Pro, M2 Ultra, M3 Max) ☆24 · Dec 16, 2023 · Updated 2 years ago
- A simple UI / web frontend for MLX mlx-lm using Streamlit ☆260 · Oct 25, 2025 · Updated 3 months ago
- GenAI & agent toolkit for Apple Silicon Macs, implementing JSON schema-steered structured output (3SO) and tool-calling in Python. For mor… ☆132 · Dec 8, 2025 · Updated 2 months ago
- Run and train GPT-2 on Apple silicon ☆33 · Feb 6, 2024 · Updated 2 years ago
- MLX image models for Apple Silicon machines ☆91 · Nov 30, 2025 · Updated 2 months ago
- A fast, minimalistic implementation of guided generation on Apple Silicon using Outlines and MLX ☆59 · Feb 9, 2024 · Updated 2 years ago
- 🧠 Retrieval-Augmented Generation (RAG) example ☆19 · Aug 18, 2025 · Updated 5 months ago
- An example implementation of RLHF (or, more accurately, RLAIF) built on MLX and Hugging Face ☆37 · Jun 21, 2024 · Updated last year
- Port of Suno's Bark TTS transformer in Apple's MLX framework ☆86 · Feb 11, 2024 · Updated 2 years ago
- Tool for exporting Apple Neural Engine-accelerated versions of transformer models on the Hugging Face Hub ☆13 · May 2, 2023 · Updated 2 years ago
- Very basic framework for composable, parameterized large language model (Q)LoRA / (Q)DoRA fine-tuning using mlx, mlx_lm, and OgbujiPT.