thushan / olla

Lightweight & fast AI inference proxy for self-hosted LLM backends such as Ollama, LM Studio and others. Designed for speed, simplicity and local-first deployments.
33 stars · Updated last week
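Since olla proxies Ollama-compatible backends, a typical interaction is a standard Ollama API request pointed at the proxy instead of the backend directly. The sketch below builds the JSON body for Ollama's `/api/generate` endpoint; the proxy base URL and route are assumptions (check olla's README for its actual listen address and path layout).

```python
import json

# Hypothetical proxy listen address -- replace with your olla instance's
# configured address; the port here is an assumption, not olla's default.
PROXY_BASE = "http://localhost:8080"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's standard /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")

body = build_generate_request("llama3.2", "Hello!")

# To send it (requires a running proxy in front of an Ollama backend):
#   import urllib.request
#   req = urllib.request.Request(
#       PROXY_BASE + "/api/generate", data=body,
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
print(json.loads(body)["model"])
```

Because the proxy speaks the same API as the backend, existing Ollama clients need only a base-URL change to route through it.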

Alternatives and similar repositories for olla

Users interested in olla compare it to the libraries listed below.