karelnagel / llama-app

Run LLaMA inference on CPU with Rust πŸ¦€πŸš€πŸ¦™
β˜† 22 Β· Updated last year

Alternatives and similar repositories for llama-app:

Users interested in llama-app are comparing it to the libraries listed below.