xcena-dev / maru
High-Performance KV Cache Storage Engine on CXL Shared Memory for LLM Inference
45 stars · Mar 19, 2026 · Updated this week

Alternatives and similar repositories for maru

Users interested in maru are comparing it to the libraries listed below.

