mit-han-lab / llm-awq
[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
3,488 stars · Jul 17, 2025 · Updated 8 months ago
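The repository implements AWQ, whose core idea is to protect the small fraction of salient weight channels (identified by activation magnitudes) by scaling them up before low-bit quantization, then folding the scale back out so the layer's output is preserved. Below is a minimal NumPy sketch of that idea, not the repository's actual implementation: the function name `awq_style_quantize`, the exponent `alpha`, and the per-row symmetric INT quantizer are illustrative assumptions.

```python
import numpy as np

def awq_style_quantize(W, act_mag, n_bits=4, alpha=0.5):
    """Sketch of activation-aware weight quantization.

    W       : (out_features, in_features) weight matrix
    act_mag : (in_features,) mean absolute activation per input channel
    """
    # Per-input-channel scale from activation magnitudes (AWQ idea:
    # channels with large activations are salient and get larger scales,
    # shrinking their relative quantization error). `alpha` is a
    # hypothetical knob; AWQ searches for the best scaling.
    s = np.clip(act_mag, 1e-5, None) ** alpha
    s = s / s.mean()                      # normalize so scales average to 1

    Ws = W * s                            # scale salient input channels up

    # Symmetric per-output-row integer quantization of the scaled weights.
    qmax = 2 ** (n_bits - 1) - 1
    step = np.abs(Ws).max(axis=1, keepdims=True) / qmax
    Q = np.clip(np.round(Ws / step), -qmax - 1, qmax)

    # Dequantize and fold the scaling back out. Since
    # (x / s) @ (W * s).T == x @ W.T, dividing by s restores the layer's
    # mathematical function up to quantization error.
    W_hat = (Q * step) / s
    return W_hat
```

In practice the scales are absorbed into the previous layer (or the activations are divided by `s` at runtime), so the quantized kernel only ever sees the scaled integer weights.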

Alternatives and similar repositories for llm-awq

Users interested in llm-awq are comparing it to the libraries listed below.

