mostlygeek / llama-swap

HTTP proxy for on-demand model loading with llama.cpp (or other OpenAI-compatible backends)
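A minimal configuration sketch of the on-demand idea, assuming llama-swap's YAML config uses a `models` map with a `cmd` launch string and a `${PORT}` placeholder (check the project README for the exact schema; the model name and file path here are hypothetical):

```yaml
# Hypothetical config.yaml: each entry maps an OpenAI-style model name
# to the backend command llama-swap launches on first request.
models:
  "llama-3-8b":
    cmd: llama-server --port ${PORT} -m /models/llama-3-8b.gguf
```

A client then sends a normal OpenAI-compatible request (e.g. `"model": "llama-3-8b"` to `/v1/chat/completions`) through the proxy, which starts or swaps in the matching llama.cpp server before forwarding the request.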

Related projects

Alternatives and complementary repositories for llama-swap