anordin95 / run-llama-locally

Run and explore Llama models locally with minimal dependencies on CPU
★ 190 · Updated 3 months ago
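
As a rough illustration of the kind of CPU-only local inference this repo targets, here is a minimal sketch using the llama-cpp-python bindings. This is an assumed stand-in, not run-llama-locally's own API: the repo may use a different loader, and the model path below is hypothetical.

```python
# Minimal CPU-only Llama inference sketch using llama-cpp-python.
# Assumptions: llama-cpp-python is installed (pip install llama-cpp-python)
# and a GGUF-format model file exists at the hypothetical path below.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical path
    n_ctx=2048,   # context window size
    n_threads=4,  # CPU threads to use for inference
)

output = llm(
    "Q: What is the capital of France? A:",
    max_tokens=32,
    stop=["Q:", "\n"],  # stop at the next question or newline
)
print(output["choices"][0]["text"].strip())
```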

Alternatives and similar repositories for run-llama-locally:

Users interested in run-llama-locally are comparing it to the libraries listed below.