neobundy / MLX-Stable-Diffusion-WebUI
MLX Stable Diffusion WebUI for Apple's MLX Stable Diffusion example code.
☆102 · Updated last year
Alternatives and similar repositories for MLX-Stable-Diffusion-WebUI
Users interested in MLX-Stable-Diffusion-WebUI are comparing it to the libraries listed below.
- A simple UI / Web / Frontend for MLX mlx-lm using Streamlit. ☆253 · Updated 3 months ago
- huggingface chat-ui integration with mlx-lm server ☆60 · Updated last year
- A wannabe Ollama equivalent for Apple MLX models ☆67 · Updated 2 months ago
- SiLLM simplifies the process of training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework. ☆266 · Updated this week
- ☆171 · Updated 9 months ago
- Chat with MLX is a high-performance macOS application that connects your local documents to a personalized large language model (LLM). ☆174 · Updated last year
- A little file for doing LLM-assisted prompt expansion and image generation using Flux.schnell - complete with prompt history, prompt queu… ☆26 · Updated 9 months ago
- A simple script to enhance text editing across your Mac, leveraging the power of MLX. Designed for seamless integration, it offers real-t… ☆104 · Updated last year
- Gradio-based tool to run open-source LLM models directly from Hugging Face ☆91 · Updated 10 months ago
- Python tools for WhisperKit: Model conversion, optimization and evaluation ☆213 · Updated last week
- ☆169 · Updated last month
- A Streamlit application that allows users to generate images using SD3 via the Stability AI API. ☆58 · Updated last year
- Chat with any website on your local machine ☆74 · Updated 10 months ago
- All the world is a play, we are but actors in it. ☆49 · Updated this week
- ☆282 · Updated 11 months ago
- For inferring and serving local LLMs using the MLX framework ☆103 · Updated last year
- Phi-3.5 for Mac: Locally-run Vision and Language Models for Apple Silicon ☆265 · Updated 8 months ago
- MacOS Agent: A Simplified Assistant for Your Mac ☆79 · Updated 9 months ago
- Scripts to create your own MoE models using MLX ☆89 · Updated last year
- The easiest way to run the fastest MLX-based LLMs locally ☆280 · Updated 6 months ago
- Local ML voice chat using high-end models. ☆163 · Updated this week
- Filter X content using LLM API requests, configurable, based on the Groq API ☆131 · Updated 9 months ago
- Port of Suno's Bark TTS transformer in Apple's MLX framework ☆81 · Updated last year
- Minimal, clean-code implementation of RAG with MLX using GGUF model weights ☆50 · Updated last year
- ☆52 · Updated last month
- A set of custom nodes for ComfyUI that allow you to use Core ML models in your ComfyUI workflows. ☆158 · Updated last month
- The source repository to manage the "Community" section of models and LoRAs in the Draw Things app. ☆39 · Updated this week
- Your gateway to both Ollama & Apple MLX models ☆131 · Updated 2 months ago
- A simple example of using MLX for a RAG application running locally on your Apple Silicon device. ☆169 · Updated last year
- MLX implementations of various transformers, with speedups and training ☆34 · Updated last year