jundaf2 / INT8-Flash-Attention-FMHA-Quantization

157 stars · Updated last year

Alternatives and similar repositories for INT8-Flash-Attention-FMHA-Quantization:

Users interested in INT8-Flash-Attention-FMHA-Quantization are comparing it to the libraries listed below.