xlite-dev / LLM-InfraLinks

πŸ“š A curated list of Awesome LLM/VLM Inference Papers with code: Flash-Attention, Paged-Attention, WINT8/4, Parallelism, etc.
β˜† 4,142 Β· Updated this week

Alternatives and similar repositories for LLM-InfraLinks

Users interested in LLM-InfraLinks are comparing it to the libraries listed below.
