lukestanley / llm_proxy_cache_stats

Cache requests to the OpenAI API and inspect which requests were made along with their responses. Analytics inspired by Helicone, but simpler: currently aimed at development, with an easier setup.
☆17 · Updated last year
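The description above amounts to a caching proxy that keys responses by request payload and logs every request/response pair for analytics. A minimal sketch of that idea (hypothetical code, not taken from the project; `proxy_request`, `cache_key`, and `request_log` are illustrative names):

```python
import hashlib
import json

cache = {}        # request-hash -> cached upstream response
request_log = []  # (payload, response, was_cache_hit) tuples for later inspection

def cache_key(payload: dict) -> str:
    """Stable hash of the JSON payload (sorted keys so dict ordering is irrelevant)."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def proxy_request(payload: dict, upstream_call) -> dict:
    """Serve a repeated request from the cache; otherwise call upstream and cache it."""
    key = cache_key(payload)
    hit = key in cache
    if not hit:
        cache[key] = upstream_call(payload)
    request_log.append((payload, cache[key], hit))
    return cache[key]
```

For example, calling `proxy_request({"prompt": "hi"}, some_upstream)` twice would invoke the upstream only once; the second call is recorded in `request_log` as a cache hit, which is the kind of data the stats view would summarize.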

Alternatives and similar repositories for llm_proxy_cache_stats:

Users interested in llm_proxy_cache_stats are comparing it to the libraries listed below.