hybridgroup / yzma
Go with your own intelligence - Go applications that directly integrate llama.cpp for local inference using hardware acceleration.
340 · Feb 25, 2026 · Updated last week

Alternatives and similar repositories for yzma

Users interested in yzma are comparing it to the libraries listed below.

