lukestanley / llm_proxy_cache_stats

Cache requests to the OpenAI API and inspect which requests were made along with their responses. Analytics inspired by Helicone but simpler; currently aimed at development, with an easier setup.
☆ 17 · Updated last year
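The core idea behind this kind of proxy cache can be sketched as hashing the request payload and reusing stored responses for repeated requests. The sketch below is a minimal illustration of that technique only; the function names and cache structure are hypothetical and are not this repo's actual API:

```python
import hashlib
import json

# In-memory cache mapping a request hash to its stored response.
_cache: dict = {}

def cache_key(payload: dict) -> str:
    # Hash the canonical JSON form of the request so that
    # identical requests always map to the same key.
    canonical = json.dumps(payload, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def cached_call(payload: dict, upstream) -> dict:
    # Return a stored response when the same request was seen
    # before; otherwise forward to the upstream API and record
    # the result for later requests.
    key = cache_key(payload)
    if key not in _cache:
        _cache[key] = upstream(payload)
    return _cache[key]
```

In a real proxy the same keying approach would apply to OpenAI chat-completion payloads, and the recorded request/response pairs double as the data source for the stats view.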
