YellowRoseCx / koboldcpp-rocm
AI Inferencing at the Edge. A simple one-file way to run various GGML models with KoboldAI's UI, with AMD ROCm offloading.
☆619 · Updated last week
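For orientation, the sketch below shows one way a running koboldcpp-rocm instance is commonly queried from Python. It assumes the server was launched with its defaults (for example `python koboldcpp.py --model model.gguf --gpulayers 35`; exact flag names can differ between releases) and that it exposes the standard KoboldAI `/api/v1/generate` endpoint on port 5001; check the repository's README for the authoritative launch options.

```python
# Minimal sketch: query a locally running koboldcpp-rocm server through the
# standard KoboldAI generate endpoint. The host, port (5001 is the KoboldCpp
# default), and sampling parameters here are assumptions, not fork-specific.
import json
import urllib.request

API_URL = "http://localhost:5001/api/v1/generate"  # assumed default address

payload = {
    "prompt": "Explain ROCm offloading in one sentence:",
    "max_length": 80,     # number of tokens to generate
    "temperature": 0.7,   # sampling temperature
}

req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# KoboldCpp-style servers return generations under results[0].text
print(result["results"][0]["text"])
```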
Alternatives and similar repositories for koboldcpp-rocm
Users interested in koboldcpp-rocm are comparing it to the libraries listed below.
- The official API server for Exllama. OAI compatible, lightweight, and fast (see the OpenAI-style request sketch after this list). ☆969 · Updated this week
- Prebuilt Windows ROCm Libs for gfx1031 and gfx1032 ☆138 · Updated 2 months ago
- CUDA on AMD GPUs ☆495 · Updated 2 weeks ago
- The most powerful and modular stable diffusion GUI, api and backend with a graph/nodes interface. Now ZLUDA enhanced for better AMD GPU p… ☆430 · Updated this week
- AMD (Radeon GPU) ROCm based setup for popular AI tools on Ubuntu 24.04.1 ☆204 · Updated 3 months ago
- Forge for stable-diffusion-webui-amdgpu (formerly stable-diffusion-webui-directml) ☆127 · Updated last month
- Web UI for ExLlamaV2 ☆495 · Updated 3 months ago
- Croco.Cpp is a 3rd party testground for KoboldCPP, a simple one-file way to run various GGML/GGUF models with KoboldAI's UI. (for Croco.C… ☆107 · Updated this week
- ROCm library files for gfx1103 and other AMD GPU architectures, for use on Windows. ☆514 · Updated 4 months ago
- A zero dependency web UI for any LLM backend, including KoboldCpp, OpenAI and AI Horde ☆121 · Updated this week
- ☆326 · Updated last month
- ☆630 · Updated this week
- Adds AMD support in ZLUDA. ☆59 · Updated 2 weeks ago
- An extension for SillyTavern that lets characters think before responding ☆109 · Updated 3 months ago
- Extensions API for SillyTavern. ☆619 · Updated 5 months ago
- An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs ☆375 · Updated this week
- Docker variants of oobabooga's text-generation-webui, including pre-built images. ☆426 · Updated this week
- KoboldAI is generative AI software optimized for fictional use, but capable of much more! ☆405 · Updated 4 months ago
- Stable Diffusion Knowledge Base (Setups, Basics, Guides and more) ☆78 · Updated last month
- A Quick Reply set for SillyTavern that gently guides the model's output. ☆68 · Updated 2 months ago
- LLM Frontend in a single html file ☆482 · Updated 4 months ago
- ☆158 · Updated last year
- Launcher scripts for SillyTavern and ST-Extras. ☆355 · Updated last week
- Stable Diffusion web UI ☆2,111 · Updated last week
- Model swapping for llama.cpp (or any local OpenAI-compatible server) ☆848 · Updated this week
- Amica is an open source interface for interactive communication with 3D characters, with voice synthesis and speech recognition. ☆960 · Updated this week
- A self-hosted web UI for 30+ generative AI models ☆582 · Updated this week
- Large-scale LLM inference engine ☆1,435 · Updated this week
- An AI assistant beyond the chat box. ☆329 · Updated last year
- ☆60 · Updated this week
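Several of the servers listed above (for example the ExLlama API server and the llama.cpp model-swapping proxy) advertise OpenAI-compatible endpoints. The sketch below shows a generic `/v1/chat/completions` request; the base URL, port, model name, and API key are placeholders that differ per project, so treat it as an illustrative assumption rather than any single project's documented setup.

```python
# Minimal sketch of an OpenAI-compatible chat completion request against a
# local inference server. Every value below (URL, port, model id, API key)
# is a placeholder assumption; consult the specific project's documentation.
import json
import urllib.request

BASE_URL = "http://localhost:5000/v1/chat/completions"  # placeholder endpoint

payload = {
    "model": "local-model",  # placeholder model identifier
    "messages": [{"role": "user", "content": "Hello from an AMD GPU!"}],
    "max_tokens": 64,
}

req = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Some local servers require a key, others ignore this header entirely.
        "Authorization": "Bearer YOUR_LOCAL_API_KEY",
    },
)

with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

# Standard OpenAI-style response shape: choices[0].message.content
print(reply["choices"][0]["message"]["content"])
```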