HyperMink / inferenceable

Scalable AI inference server for CPU and GPU, built with Node.js. Uses llama.cpp and parts of the llamafile C/C++ core under the hood.
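Since the server wraps llama.cpp, a client would typically POST a JSON completion request to a local HTTP endpoint. The sketch below builds such a request payload in Node.js; the endpoint path, port, and payload shape are assumptions (an OpenAI-style chat-completions API, as commonly exposed by llama.cpp-based servers), not confirmed details of inferenceable itself.

```javascript
// Build an OpenAI-style chat-completion request body.
// Field names (model, messages, max_tokens, stream) are assumptions
// based on common llama.cpp server conventions.
function buildChatRequest(prompt, { model = "default", maxTokens = 128 } = {}) {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
    max_tokens: maxTokens,
    stream: false,
  };
}

// Hypothetical usage against a locally running server (uncomment to try):
// const res = await fetch("http://localhost:8080/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildChatRequest("Hello!")),
// });
// console.log((await res.json()).choices[0].message.content);
```

The request body is kept separate from the transport so the same payload builder works with `fetch`, `axios`, or any other HTTP client.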
