mit-han-lab / llm-awq

[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
2,498 stars · Updated 3 weeks ago
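
For context, AWQ's core observation is that a small fraction of weight channels, those fed by large-magnitude activations, dominate model quality, so it scales those input channels up before group-wise low-bit quantization and folds the inverse scale back so the layer's output is unchanged. Below is a minimal, self-contained PyTorch sketch of that idea; it is illustrative only, not the repo's implementation, and all function names (`pseudo_quantize`, `awq_scale_and_quantize`) and the fixed `alpha=0.5` exponent are assumptions (the paper grid-searches the scaling exponent per layer):

```python
# Minimal sketch of the AWQ idea (illustrative; not llm-awq's actual code).
import torch

def pseudo_quantize(w: torch.Tensor, n_bits: int = 4, group_size: int = 128) -> torch.Tensor:
    """Group-wise asymmetric round-to-nearest quantize-dequantize."""
    out_features, in_features = w.shape
    w = w.reshape(-1, group_size)
    w_max = w.amax(dim=1, keepdim=True)
    w_min = w.amin(dim=1, keepdim=True)
    scale = (w_max - w_min).clamp(min=1e-5) / (2**n_bits - 1)
    zero = (-w_min / scale).round()
    w_q = (torch.clamp((w / scale).round() + zero, 0, 2**n_bits - 1) - zero) * scale
    return w_q.reshape(out_features, in_features)

def awq_scale_and_quantize(w, act_abs_mean, alpha=0.5, n_bits=4, group_size=128):
    """Activation-aware quantization: scale input channels by
    |activation|**alpha so salient channels lose less precision,
    quantize, then fold the scales back (x @ W.T is preserved)."""
    s = act_abs_mean.clamp(min=1e-5) ** alpha      # per-input-channel scale
    s = s / (s.max() * s.min()).sqrt()             # normalize scales around 1
    w_q = pseudo_quantize(w * s, n_bits, group_size)
    return w_q / s

# Toy usage with fake calibration statistics.
w = torch.randn(256, 512)
act_abs_mean = torch.rand(512) * 3                 # per-channel mean |activation|
w_awq = awq_scale_and_quantize(w, act_abs_mean)
print((w - w_awq).abs().mean())                    # reconstruction error
```

The sketch uses a simulated (quantize-dequantize) path for clarity; the actual repo additionally packs weights into 4-bit integers and runs them with custom kernels for real speedups.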

Related projects

Alternatives and complementary repositories for llm-awq