HyperMink / inferenceable

Scalable AI Inference Server for CPU and GPU with Node.js | Uses llama.cpp and parts of the llamafile C/C++ core under the hood.

Alternatives and similar repositories for inferenceable

Users interested in inferenceable are comparing it to the libraries listed below.
