YellowRoseCx / koboldcpp-rocm
AI Inferencing at the Edge. A simple one-file way to run various GGML models with KoboldAI's UI, with AMD ROCm offloading
☆721 · Updated last month
Alternatives and similar repositories for koboldcpp-rocm
Users interested in koboldcpp-rocm are comparing it to the libraries listed below.
- The most powerful and modular stable diffusion GUI, api and backend with a graph/nodes interface. Now ZLUDA enhanced for better AMD GPU p… ☆728 · Updated last week
- CUDA on AMD GPUs ☆590 · Updated 3 months ago
- Prebuilt Windows ROCm Libs for gfx1031 and gfx1032 ☆167 · Updated 9 months ago
- The official API server for Exllama. OAI compatible, lightweight, and fast. ☆1,100 · Updated last week
- AMD (Radeon GPU) ROCm based setup for popular AI tools on Ubuntu 24.04.1 ☆216 · Updated 3 weeks ago
- Adds AMD support via ZLUDA ☆77 · Updated 5 months ago
- Forge for stable-diffusion-webui-amdgpu (formerly stable-diffusion-webui-directml) ☆166 · Updated 2 months ago
- AMD-SHARK Studio -- Web UI for SHARK+IREE High Performance Machine Learning Distribution ☆1,451 · Updated last week
- Web UI for ExLlamaV2 ☆514 · Updated 10 months ago
- Croco.Cpp is a fork of KoboldCPP for inferring GGML/GGUF models on CPU/CUDA with KoboldAI's UI. It's powered partly by IK_LLama.cpp, and compati… ☆154 · Updated last week
- Extensions API for SillyTavern. ☆657 · Updated last year
- An extension for SillyTavern that lets characters think before responding ☆131 · Updated last month
- LLM frontend in a single HTML file ☆672 · Updated 2 weeks ago
- ☆419 · Updated 8 months ago
- Stable Diffusion Docker image preconfigured for use with AMD Radeon cards ☆141 · Updated last year
- Run GGUF models easily with a KoboldAI UI. One File. Zero Install. ☆9,104 · Updated this week
- An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs ☆605 · Updated last week
- KoboldAI is generative AI software optimized for fictional use, but capable of much more! ☆418 · Updated 11 months ago
- Next-generation AI roleplay system ☆286 · Updated last week
- ☆671 · Updated last week
- Launcher scripts for SillyTavern and ST-Extras. ☆432 · Updated 2 weeks ago
- A zero-dependency web UI for any LLM backend, including KoboldCpp, OpenAI and AI Horde ☆147 · Updated this week
- Stable Diffusion web UI ☆2,286 · Updated last month
- An OpenAI API compatible text-to-speech server using Coqui AI's xtts_v2 and/or Piper TTS as the backend. ☆837 · Updated 10 months ago
- llama.cpp fork with additional SOTA quants and improved performance ☆1,390 · Updated this week
- Stable Diffusion Knowledge Base (Setups, Basics, Guides and more) ☆113 · Updated 5 months ago
- A self-hosted web UI for 30+ generative AI tools ☆647 · Updated last week
- The HIP Environment and ROCm Kit - A lightweight open source build system for HIP and ROCm ☆118 · Updated last week
- Ginger is a stand-alone editor for LLM character cards. ☆80 · Updated 3 months ago
- ☆157 · Updated 2 years ago