simonw / llm-mlx
Support for MLX models in LLM
☆185 · Updated 2 months ago
Alternatives and similar repositories for llm-mlx
Users interested in llm-mlx are comparing it to the repositories listed below.
- LLM plugin providing access to models running on an Ollama server ☆316 · Updated last week
- LLM plugin for running models using llama.cpp ☆143 · Updated last year
- LLM plugin for models hosted by OpenRouter ☆180 · Updated last month
- LLM plugin to access Google's Gemini family of models ☆341 · Updated this week
- LLM plugin for running models using MLC ☆187 · Updated last year
- llm-consortium orchestrates multiple LLMs, iteratively refining responses to reach consensus ☆249 · Updated last week
- LLM access to models by Anthropic, including the Claude series ☆122 · Updated 3 weeks ago
- CLI tool for running text through OpenAI text-to-speech ☆168 · Updated last year
- FastMLX is a high-performance, production-ready API for hosting MLX models ☆308 · Updated 3 months ago
- Load GitHub repository contents as LLM fragments ☆56 · Updated last month
- Local coding agent with a neat UI ☆198 · Updated last month
- The official Model Context Protocol (MCP) server for Kagi search & other tools ☆113 · Updated 3 weeks ago
- LLM access to pplx-api ☆30 · Updated 2 weeks ago
- LLM plugin for interacting with the Claude 3 family of models ☆292 · Updated 4 months ago
- Run larger LLMs with longer contexts on Apple Silicon by using differentiated precision for KV cache quantization. KVSplit enables 8-bit … ☆351 · Updated last month
- Ask questions of your data with LLM assistance ☆64 · Updated 6 months ago
- Your gateway to both Ollama & Apple MLX models ☆135 · Updated 3 months ago
- Easily copy all relevant source files in a repository to the clipboard, for use in LLM code-understanding and generation workflows ☆222 · Updated 4 months ago
- Use an LLM to generate and execute commands in your shell ☆405 · Updated 3 weeks ago
- MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. I… ☆418 · Updated last week
- Fine-grained control over Model Context Protocol (MCP) clients, servers, and tools. Context is God. ☆111 · Updated last week
- LLM plugin providing access to Mistral models using the Mistral API ☆183 · Updated 3 weeks ago
- SiLLM simplifies training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework ☆273 · Updated last week
- A cookiecutter template for creating a new LLM plugin that adds tools to LLM ☆25 · Updated 3 weeks ago
- Concatenated documentation for use with LLMs ☆37 · Updated last week
- Virtual environment stacks for Python ☆258 · Updated this week
- CLI tool for stripping tags from HTML ☆325 · Updated 3 months ago
- A Python package for serving LLMs on OpenAI-compatible API endpoints with prompt caching, using MLX ☆85 · Updated this week
- A wannabe Ollama equivalent for Apple MLX models ☆68 · Updated 3 months ago