mit-han-lab / llm-awq

[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
3,041 stars · Updated 3 weeks ago
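The repository implements AWQ (Activation-aware Weight Quantization), which protects salient weight channels by scaling them according to activation magnitudes before low-bit quantization. As a rough illustration of that idea only — not the repo's actual API, and with the scale exponent and shapes chosen arbitrarily for this toy — a minimal sketch in numpy:

```python
import numpy as np

# Toy sketch of the activation-aware scaling idea (hypothetical, not llm-awq's API):
# scale input channels by activation magnitude before round-to-nearest
# quantization, then fold the scale back out at dequantization time.

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))       # weight matrix [out_features, in_features]
X = rng.normal(size=(64, 16))      # calibration activations [tokens, in_features]

# Per-input-channel scale from mean activation magnitude (alpha=0.5 heuristic).
s = np.abs(X).mean(axis=0) ** 0.5

def quantize(w, bits=4):
    """Symmetric round-to-nearest quantization, one step size per output row."""
    qmax = 2 ** (bits - 1) - 1
    step = np.abs(w).max(axis=1, keepdims=True) / qmax
    return np.round(w / step), step

# Quantize the scaled weights; dequantize and undo the scale.
q, step = quantize(W * s)
W_hat = (q * step) / s
```

Because salient channels are scaled up before rounding, their relative rounding error shrinks; the inverse scale can be folded into the preceding layer's output so inference cost is unchanged.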

Alternatives and similar repositories for llm-awq

Users interested in llm-awq are comparing it to the libraries listed below.
