xlite-dev / Awesome-LLM-Inference
📚A curated list of Awesome LLM/VLM Inference Papers with Codes: Flash-Attention, Paged-Attention, WINT8/4, Parallelism, etc.🎉
5,022 stars · Updated this week

Alternatives and similar repositories for Awesome-LLM-Inference

Users interested in Awesome-LLM-Inference are comparing it to the libraries listed below.
