onehr / llama-rs

Run LLaMA inference on CPU, with Rust 🦀🚀🦙
☆ 18 · Updated last week
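The repository's tagline describes running LLaMA inference on CPU in Rust. As a rough illustration of the idea (not the crate's actual API, which is not documented here), the sketch below shows the core autoregressive decoding loop that such an engine runs on the CPU: feed the current token context through the model, take the highest-scoring token, append it, and repeat. The `fake_forward` function is a hypothetical stand-in for the real transformer forward pass.

```rust
// Hypothetical stand-in for the transformer forward pass: returns pseudo-logits
// derived deterministically from the context. In a real engine this is the
// expensive CPU-bound matrix-multiply step.
fn fake_forward(context: &[u32], vocab_size: usize) -> Vec<f32> {
    let seed = context
        .iter()
        .fold(0u32, |acc, &t| acc.wrapping_mul(31).wrapping_add(t));
    (0..vocab_size)
        .map(|tok| ((seed.wrapping_add(tok as u32) % 97) as f32) / 97.0)
        .collect()
}

// Greedy autoregressive decoding: pick the argmax token each step and append it.
fn generate(prompt: &[u32], steps: usize, vocab_size: usize) -> Vec<u32> {
    let mut tokens = prompt.to_vec();
    for _ in 0..steps {
        let logits = fake_forward(&tokens, vocab_size);
        let next = logits
            .iter()
            .enumerate()
            .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
            .map(|(i, _)| i as u32)
            .unwrap();
        tokens.push(next);
    }
    tokens
}

fn main() {
    let prompt = vec![1, 15, 42];
    println!("{:?}", generate(&prompt, 5, 32));
}
```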

Alternatives and similar repositories for llama-rs:

Users interested in llama-rs are comparing it to the libraries listed below.