RWKV-APP / RWKV_APP
A fast, lightweight, and extensible RWKV chat UI powered by Flutter. Offline-ready with multi-backend support, ideal for local RWKV inference.
☆37 · Updated last week
Alternatives and similar repositories for RWKV_APP
Users interested in RWKV_APP are comparing it to the libraries listed below:
- Local Qwen3 LLM inference. One easy-to-understand file of C source with no dependencies. ☆148 · Updated 5 months ago
- An educational Rust project for exporting and running inference on the Qwen3 LLM family. ☆35 · Updated 4 months ago
- RWKV-LM-V7 (https://github.com/BlinkDL/RWKV-LM) under the Lightning framework. ☆49 · Updated last month
- The DPAB-α Benchmark. ☆32 · Updated 10 months ago
- SVGBench: a challenging benchmark that tests the knowledge, coding, and physical-reasoning capabilities of LLMs. ☆57 · Updated last week
- Review/Check GGUF files and estimate the memory usage and maximum tokens per second. ☆219 · Updated 3 months ago
- Lightweight C inference for Qwen3 GGUF. Multiturn prefix caching & batch processing. ☆19 · Updated 3 months ago
- A simple no-install web UI for Ollama and OAI-Compatible APIs! ☆31 · Updated 10 months ago
- ☆94 · Updated 5 months ago
- Create 3D files in the CLI with a Small Language Model. ☆43 · Updated last month
- Inference RWKV v7 in pure C. ☆42 · Updated 2 months ago
- A simple, easy-to-customize pipeline for local RAG evaluation. Starter prompts and metric definitions included. ☆25 · Updated last month
- Chat WebUI is an easy-to-use user interface for interacting with AI, and it comes with multiple useful built-in tools such as web search … ☆46 · Updated 3 months ago
- ☆52 · Updated 2 months ago
- An OpenVoice-based voice cloning tool, single executable file (~14M), supporting multiple formats without dependencies on ffmpeg, Python,… ☆39 · Updated 3 months ago
- LLM inference in C/C++. ☆103 · Updated this week
- ☆26 · Updated 10 months ago
- Implementation of the RWKV language model in pure WebGPU/Rust. ☆333 · Updated last month
- Running Microsoft's BitNet via Electron, React & Astro. ☆48 · Updated 2 months ago
- Neo AI integrates into the Linux terminal, capable of executing system commands and providing helpful information. ☆122 · Updated 7 months ago
- Golang web client for Ollama, fast and easy to use. ☆30 · Updated 4 months ago
- AI Assistant. ☆20 · Updated 7 months ago
- TTS support with GGML. ☆197 · Updated 2 months ago
- A Field-Theoretic Approach to Unbounded Memory in Large Language Models. ☆19 · Updated 7 months ago
- ☆16 · Updated 7 months ago
- Wraps any OpenAI API interface as a Responses API with MCP support so it works with Codex, adding any missing stateful features. Ollama and Vllm… ☆137 · Updated last month
- A pure-Rust LLM inference engine (any LLM-based MLLM such as Spark-TTS), powered by the Candle framework. ☆211 · Updated last month
- The heart of the Pulsar App: fast, secure, and shared inference with a modern UI. ☆59 · Updated last year
- 33B Chinese LLM, DPO QLORA, 100K context, AirLLM 70B inference with a single 4GB GPU. ☆13 · Updated last year
- Verify the precision of all Kimi K2 API vendors. ☆461 · Updated 3 weeks ago