turboderp-org / exllamav2
A fast inference library for running LLMs locally on modern consumer-class GPUs
4,476 stars · Updated 3 weeks ago (Mar 4, 2026)

Alternatives and similar repositories for exllamav2

Users interested in exllamav2 are comparing it to the libraries listed below.
