avilum / yalla

A tiny LLM Agent with minimal dependencies, focused on local inference.
