YellowRoseCx / koboldcpp-rocm
AI Inferencing at the Edge. A simple one-file way to run various GGML models with KoboldAI's UI with AMD ROCm offloading
☆718 · Updated 2 weeks ago
Alternatives and similar repositories for koboldcpp-rocm
Users interested in koboldcpp-rocm often compare it to the libraries listed below.
- The most powerful and modular stable diffusion GUI, api and backend with a graph/nodes interface. Now ZLUDA enhanced for better AMD GPU p… ☆700 · Updated last week
- Prebuilt Windows ROCm Libs for gfx1031 and gfx1032 ☆167 · Updated 8 months ago
- CUDA on AMD GPUs ☆584 · Updated 3 months ago
- The official API server for Exllama. OAI compatible, lightweight, and fast. ☆1,096 · Updated 2 weeks ago
- Forge for stable-diffusion-webui-amdgpu (formerly stable-diffusion-webui-directml) ☆165 · Updated 2 months ago
- AMD (Radeon GPU) ROCm based setup for popular AI tools on Ubuntu 24.04.1 ☆216 · Updated last week
- Adds AMD support in ZLUDA ☆77 · Updated 4 months ago
- ROCm library files for gfx1103 and other AMD GPU architectures, updated for use on Windows. ☆704 · Updated 2 months ago
- Croco.Cpp is a fork of KoboldCPP for inferring GGML/GGUF models on CPU/CUDA with KoboldAI's UI. It's powered partly by IK_LLama.cpp, and compati… ☆153 · Updated this week
- Web UI for ExLlamaV2 ☆514 · Updated 10 months ago
- ☆418 · Updated 8 months ago
- An extension for SillyTavern that lets characters think before responding ☆132 · Updated 2 weeks ago
- LLM Frontend in a single html file ☆670 · Updated 3 weeks ago
- Extensions API for SillyTavern. ☆653 · Updated 11 months ago
- Next-generation AI roleplay system ☆283 · Updated this week
- Launcher scripts for SillyTavern and ST-Extras. ☆419 · Updated last week
- KoboldAI is generative AI software optimized for fictional use, but capable of much more! ☆417 · Updated 10 months ago
- ☆235 · Updated 2 years ago
- Stable Diffusion web UI ☆2,276 · Updated 3 weeks ago
- A zero dependency web UI for any LLM backend, including KoboldCpp, OpenAI and AI Horde ☆145 · Updated last week
- The HIP Environment and ROCm Kit - A lightweight open source build system for HIP and ROCm ☆117 · Updated last week
- Stable Diffusion Knowledge Base (Setups, Basics, Guides and more) ☆112 · Updated 5 months ago
- An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs ☆588 · Updated this week
- ☆144 · Updated 2 months ago
- A Quick Reply set for SillyTavern to gently guide the model's output ☆78 · Updated 8 months ago
- ☆669 · Updated 3 weeks ago
- Run LLMs on AMD Ryzen™ AI NPUs in minutes. Just like Ollama - but purpose-built and deeply optimized for the AMD NPUs. ☆488 · Updated last week
- A modern, minimalist, and elegant theme for SillyTavern. Inspired by moonlit nights and gentle echoes of serenity. ☆234 · Updated last month
- Reliable model swapping for any local OpenAI/Anthropic compatible server - llama.cpp, vllm, etc. ☆1,977 · Updated last week
- An OpenAI API compatible text-to-speech server using Coqui AI's xtts_v2 and/or piper tts as the backend. ☆837 · Updated 10 months ago