ardanlabs / kronk

This project lets you use Go for hardware-accelerated local inference, with llama.cpp integrated directly into your applications via the yzma module. Kronk provides a high-level API that feels similar to using an OpenAI-compatible API. Kronk also provides a model server for running local workloads.
163 stars · Updated this week

Alternatives and similar repositories for kronk

Users interested in kronk are comparing it to the libraries listed below.
