HyperMink / inferenceable

Scalable AI inference server for CPU and GPU, built with Node.js. Uses llama.cpp and parts of the llamafile C/C++ core under the hood.
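As a Node.js-based inference server, it is typically driven over HTTP. The sketch below shows how a client might build a completion request; the field names and the endpoint shown in the comment are assumptions modeled on common llama.cpp-style servers, not confirmed against inferenceable's actual API.

```javascript
// Build a JSON payload for a text-completion request.
// NOTE: field names (prompt, max_tokens) are an assumption, not taken
// from the inferenceable documentation.
function buildCompletionRequest(prompt, maxTokens = 16) {
  return JSON.stringify({ prompt, max_tokens: maxTokens });
}

// Sending it (requires a running server; the URL and path are hypothetical):
// await fetch("http://localhost:3000/v1/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: buildCompletionRequest("Hello"),
// });
```

Check the project's own README for the real endpoint paths and request schema before using this shape.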
