hybridgroup / yzma
Go with your own intelligence - Go applications that directly integrate llama.cpp for local inference using hardware acceleration.
442 stars · Apr 28, 2026 · Updated this week

Alternatives and similar repositories for yzma

Users interested in yzma are comparing it to the libraries listed below.

