jundaf2 / INT8-Flash-Attention-FMHA-Quantization

156 · Updated last year

Alternatives and similar repositories for INT8-Flash-Attention-FMHA-Quantization

Users interested in INT8-Flash-Attention-FMHA-Quantization are comparing it to the libraries listed below.
