ardanlabs / kronk
Your personal engine for running open source models locally. Kronk uses Go for hardware-accelerated local inference, with llama.cpp integrated directly into your Go applications via the yzma module. It provides a high-level API that feels similar to using an OpenAI-compatible API, and it also provides a model server to run local work
193 · Mar 1, 2026 · Updated last week

Alternatives and similar repositories for kronk

Users interested in kronk are comparing it to the libraries listed below.

