xlite-dev / Awesome-LLM-Inference
📚A curated list of Awesome LLM/VLM Inference Papers with Codes: Flash-Attention, Paged-Attention, WINT8/4, Parallelism, etc.🎉
5,130 stars · Apr 9, 2026 · Updated this week

Alternatives and similar repositories for Awesome-LLM-Inference

Users interested in Awesome-LLM-Inference are comparing it to the libraries listed below.

