mit-han-lab / llm-awq
[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
☆3,436 · Updated Jul 17, 2025
Alternatives and similar repositories for llm-awq
Users interested in llm-awq are comparing it to the libraries listed below.