turboderp-org / exllamav2
A fast inference library for running LLMs locally on modern consumer-class GPUs
4,444 · Dec 9, 2025 · Updated 2 months ago

Alternatives and similar repositories for exllamav2

Users who are interested in exllamav2 are comparing it to the libraries listed below.
