devanmolsharma / cachelm
A semantic caching layer for LLM apps, meant to cut down on repeated API calls even when users phrase the same request differently.
14 · Jul 3, 2025 · Updated 8 months ago

Alternatives and similar repositories for cachelm

Users interested in cachelm are comparing it to the libraries listed below.
