hybridgroup / yzma
Go with your own intelligence - Go applications that directly integrate llama.cpp for local inference using hardware acceleration.
367 stars · Mar 24, 2026 · Updated 2 weeks ago

Alternatives and similar repositories for yzma

Users interested in yzma are comparing it to the libraries listed below.

