llm-fireq / fireq

FireQ: Fast INT4-FP8 Kernel and RoPE-aware Quantization for LLM Inference Acceleration
20 stars · Jun 27, 2025 · Updated 7 months ago
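The repository's own code is not shown here, but as a rough, illustrative sketch of the per-group symmetric INT4 weight quantization that the title alludes to (function names, the group size, and the symmetric scheme are all assumptions for illustration, not FireQ's actual API or method):

```python
import numpy as np

def quantize_int4_groups(w: np.ndarray, group_size: int = 128):
    """Illustrative per-group symmetric INT4 quantization (not FireQ's code).

    Splits a 1-D weight vector into groups, scales each group so its max
    magnitude maps to 7 (symmetric INT4 range is [-8, 7]), and rounds.
    """
    w = w.reshape(-1, group_size)
    scales = np.abs(w).max(axis=1, keepdims=True) / 7.0  # one FP scale per group
    q = np.clip(np.round(w / scales), -8, 7).astype(np.int8)
    return q, scales

def dequantize(q: np.ndarray, scales: np.ndarray) -> np.ndarray:
    """Reconstruct approximate FP weights from INT4 codes and group scales."""
    return q.astype(np.float32) * scales

# Example: quantize a random weight row and check reconstruction error.
w = np.random.randn(1024).astype(np.float32)
q, s = quantize_int4_groups(w)
w_hat = dequantize(q, s).reshape(-1)
print("max abs error:", np.abs(w - w_hat).max())
```

In a real INT4-FP8 pipeline like the one the title describes, the INT4 codes would feed a fused GPU kernel with FP8 activations rather than being dequantized in NumPy; the sketch above only shows the quantize/dequantize bookkeeping.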
