hybridgroup / yzma

Write Go applications that directly integrate llama.cpp for local inference using hardware acceleration.
201 stars · Updated last week
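
To make the one-line description above concrete, here is a minimal sketch of what "directly integrating llama.cpp from a Go application" can look like. It is not yzma's actual API: the snippet assumes the github.com/ebitengine/purego package, a libllama shared library on the loader path, and a llama.cpp build recent enough to export the symbols used — all of these are illustrative assumptions.

```go
// Sketch only: load a llama.cpp shared library from Go without CGo and call
// a few of its C functions. This illustrates the general approach, not
// yzma's API. Adjust the library name for your platform (.so/.dylib/.dll).
package main

import (
	"fmt"
	"log"

	"github.com/ebitengine/purego"
)

func main() {
	// Assumption: libllama.so is discoverable by the dynamic loader.
	lib, err := purego.Dlopen("libllama.so", purego.RTLD_NOW|purego.RTLD_GLOBAL)
	if err != nil {
		log.Fatalf("loading libllama: %v", err)
	}

	// Bind a handful of llama.cpp C symbols to Go function variables.
	var (
		backendInit        func()
		backendFree        func()
		supportsGPUOffload func() bool
	)
	purego.RegisterLibFunc(&backendInit, lib, "llama_backend_init")
	purego.RegisterLibFunc(&backendFree, lib, "llama_backend_free")
	purego.RegisterLibFunc(&supportsGPUOffload, lib, "llama_supports_gpu_offload")

	// Initialize the llama.cpp backend and report whether the build can
	// offload work to an accelerator (CUDA, Metal, Vulkan, ...).
	backendInit()
	defer backendFree()

	fmt.Println("llama.cpp loaded; GPU offload available:", supportsGPUOffload())
}
```

From here, a real integration would go on to load a GGUF model and run token generation through the library's model/context functions; the listing below collects projects that solve the same problem in different ways.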

Alternatives and similar repositories for yzma

Users interested in yzma are comparing it to the libraries listed below.
