YellowRoseCx / koboldcpp-rocm
AI Inferencing at the Edge. A simple one-file way to run various GGML models with KoboldAI's UI, with AMD ROCm offloading.
☆647 · Updated 2 weeks ago
Alternatives and similar repositories for koboldcpp-rocm
Users interested in koboldcpp-rocm are comparing it to the libraries listed below.
- Prebuilt Windows ROCm libs for gfx1031 and gfx1032 ☆147 · Updated 3 months ago
- CUDA on AMD GPUs ☆527 · Updated 2 months ago
- The most powerful and modular stable diffusion GUI, API, and backend with a graph/nodes interface. Now ZLUDA enhanced for better AMD GPU p… ☆480 · Updated this week
- The official API server for Exllama. OAI compatible, lightweight, and fast. ☆1,000 · Updated this week
- AMD (Radeon GPU) ROCm-based setup for popular AI tools on Ubuntu 24.04.1 ☆209 · Updated 4 months ago
- ROCm library files for gfx1103, updated with other arches of AMD GPUs, for use on Windows. ☆549 · Updated 5 months ago
- Forge for stable-diffusion-webui-amdgpu (formerly stable-diffusion-webui-directml) ☆134 · Updated 2 weeks ago
- Add support for AMD in ZLUDA ☆65 · Updated 3 weeks ago
- ☆356 · Updated 3 months ago
- Run GGUF models easily with a KoboldAI UI. One File. Zero Install. ☆7,773 · Updated this week
- Next-generation AI roleplay system ☆91 · Updated this week
- Stable Diffusion web UI ☆2,149 · Updated 3 weeks ago
- Croco.Cpp is a fork of KoboldCPP inferring GGML/GGUF models on CPU/CUDA with KoboldAI's UI. It's powered partly by IK_LLama.cpp, and compati… ☆111 · Updated this week
- KoboldAI is generative AI software optimized for fictional use, but capable of much more! ☆412 · Updated 6 months ago
- LLM frontend in a single HTML file ☆517 · Updated 6 months ago
- Web UI for ExLlamaV2 ☆503 · Updated 5 months ago
- Extensions API for SillyTavern. ☆637 · Updated 7 months ago
- ☆233 · Updated 2 years ago
- An extension for SillyTavern that lets characters think before responding ☆112 · Updated 4 months ago
- Launcher scripts for SillyTavern and ST-Extras. ☆367 · Updated last week
- Stable Diffusion Docker image preconfigured for use with AMD Radeon cards ☆136 · Updated last year
- A zero-dependency web UI for any LLM backend, including KoboldCpp, OpenAI, and AI Horde ☆126 · Updated this week
- Run stable-diffusion-webui with a Radeon RX 580 8GB on Ubuntu 22.04.2 LTS ☆64 · Updated last year
- An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs ☆436 · Updated this week
- Stable Diffusion knowledge base (setups, basics, guides, and more) ☆86 · Updated last week
- ☆158 · Updated last year
- A Quick Reply set for SillyTavern to gently guide the model output ☆72 · Updated 3 months ago
- A script that automatically installs everything required to run selected AI interfaces on the AMD Radeon 7900 XTX ☆27 · Updated this week
- My personal fork of koboldcpp where I hack in experimental samplers. ☆46 · Updated last year
- Docker variants of oobabooga's text-generation-webui, including pre-built images. ☆433 · Updated last week