karelnagel / llama-app

Run LLaMA inference on CPU, with Rust 🦀🚀🦙
☆ 20 · Updated last year

Related projects

Alternatives and complementary repositories for llama-app