messkan / prompt-cache

Cut LLM costs by up to 80% and unlock sub-millisecond responses with intelligent semantic caching. A drop-in OpenAI-compatible proxy written in Go.
34 · Updated this week

Alternatives and similar repositories for prompt-cache

Users interested in prompt-cache are comparing it to the libraries listed below.
