AI Inferencing at the Edge. A simple one-file way to run various GGML models with KoboldAI's UI with AMD ROCm offloading
☆746 · Dec 30, 2025 · Updated 2 months ago
Alternatives and similar repositories for koboldcpp-rocm
Users interested in koboldcpp-rocm are comparing it to the repositories listed below.
- Run GGUF models easily with a KoboldAI UI. One File. Zero Install. ☆9,544 · Feb 24, 2026 · Updated last week
- The most powerful and modular stable diffusion GUI, api and backend with a graph/nodes interface. Now ZLUDA enhanced for better AMD GPU p… ☆825 · Updated this week
- Prebuilt Windows ROCm Libs for gfx1031 and gfx1032 ☆170 · Mar 19, 2025 · Updated 11 months ago
- CUDA on AMD GPUs ☆601 · Feb 11, 2026 · Updated 2 weeks ago
- Stable Diffusion web UI ☆2,329 · Dec 31, 2025 · Updated 2 months ago
- Forge for stable-diffusion-webui-amdgpu (formerly stable-diffusion-webui-directml) ☆170 · Sep 29, 2025 · Updated 5 months ago
- Installation script for AI applications using ROCm on Linux. ☆39 · Updated this week
- AMD-SHARK Studio -- Web UI for SHARK+IREE High Performance Machine Learning Distribution ☆1,453 · Dec 15, 2025 · Updated 2 months ago
- OneTrainer is a one-stop solution for all your stable diffusion training needs. ☆16 · Oct 6, 2025 · Updated 4 months ago
- Adds AMD support in ZLUDA ☆77 · Jul 21, 2025 · Updated 7 months ago
- CUDA on non-NVIDIA GPUs ☆13,975 · Updated this week
- My own ROCm Windows builds from TheRock repository for various architectures such as 680m, 780m, rx6600, etc. ☆48 · Dec 8, 2025 · Updated 2 months ago
- Loader extension for tabbyAPI in SillyTavern ☆26 · Jun 30, 2025 · Updated 8 months ago
- AITemplate is a Python framework which renders neural networks into high-performance CUDA/HIP C++ code. Specialized for FP16 TensorCore (N… ☆12 · Jun 24, 2024 · Updated last year
- Stable Diffusion Knowledge Base (Setups, Basics, Guides and more) ☆119 · Jan 30, 2026 · Updated last month
- ☆16 · Feb 21, 2026 · Updated last week
- Fast and memory-efficient exact attention ☆221 · Updated this week
- The official API server for Exllama. OAI compatible, lightweight, and fast. ☆1,134 · Feb 9, 2026 · Updated 3 weeks ago
- SD.Next: All-in-one WebUI for AI generative image and video creation, captioning and processing ☆6,970 · Updated this week
- A password validation and generation toolkit ☆15 · Jan 7, 2023 · Updated 3 years ago
- A fast inference library for running LLMs locally on modern consumer-class GPUs ☆4,444 · Dec 9, 2025 · Updated 2 months ago
- Everything you need to set up on your AMD system for Machine Learning Stuff ☆19 · Jul 31, 2025 · Updated 7 months ago
- An extension for SillyTavern that lets characters think before responding ☆144 · Jan 9, 2026 · Updated last month
- Croco.Cpp is a fork of KoboldCPP inferring GGML/GGUF models on CPU/CUDA with KoboldAI's UI. It's powered partly by IK_LLama.cpp, and compati… ☆158 · Updated this week
- llama-swap + a minimal ollama-compatible API ☆49 · Feb 13, 2026 · Updated 2 weeks ago
- The HIP Environment and ROCm Kit - A lightweight open source build system for HIP and ROCm ☆122 · Feb 12, 2026 · Updated 2 weeks ago
- Multi-turn dataset management tool for LLM trainers ☆12 · Mar 31, 2025 · Updated 11 months ago
- Provides an interface for extensions to use language models directly in the browser. ☆16 · Feb 7, 2026 · Updated 3 weeks ago
- LLM Frontend for Power Users. ☆23,509 · Updated this week
- llama.cpp fork with additional SOTA quants and improved performance ☆1,696 · Updated this week
- Scripts and tools for optimizing quantizations in llama.cpp with GGUF imatrices. ☆18 · Jan 10, 2025 · Updated last year
- Quick and easy Diffusers CLI ☆15 · Jan 30, 2026 · Updated last month
- 111 VRM Animation Pack: For use with SillyTavern ☆64 · Jul 16, 2025 · Updated 7 months ago
- The definitive Web UI for local AI, with powerful features and easy setup. ☆46,091 · Feb 3, 2026 · Updated last month
- After my server UI improvements were successfully merged, consider this repo a playground for experimenting, tinkering and hacking around… ☆53 · Aug 18, 2024 · Updated last year
- UI for ONNX-based diffusers ☆190 · May 31, 2023 · Updated 2 years ago
- Analyze Reddit posts ☆30 · Feb 27, 2025 · Updated last year
- Create text chunks which end at natural stopping points without using a tokenizer ☆26 · Nov 26, 2025 · Updated 3 months ago
- A proxy that hosts multiple single-model runners such as LLama.cpp and vLLM ☆12 · May 30, 2025 · Updated 9 months ago