RakeshUIUC / multihead_attn_accelerator
Accelerate multihead attention transformer model using HLS for FPGA
11 · Dec 7, 2023 · Updated 2 years ago
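The repository's HLS sources are not shown here, so as a point of reference, the following is an illustrative NumPy sketch of the multihead attention computation such an accelerator targets (the function name, weight layout, and dimensions are assumptions, not the repo's actual code):

```python
import numpy as np

def multihead_attention(x, Wq, Wk, Wv, Wo, num_heads):
    """Scaled dot-product multihead attention for a single sequence.

    Illustrative reference only; an HLS accelerator would restructure
    these matrix products into pipelined, tiled hardware loops.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project input to queries/keys/values, then split into heads:
    # shape (num_heads, seq_len, d_head).
    Q = (x @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    K = (x @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    V = (x @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Per-head attention scores, scaled by sqrt(d_head).
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values, merge heads, apply output projection.
    out = (weights @ V).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo
```

The per-head score and softmax loops are the usual candidates for FPGA pipelining and array partitioning, since each head's computation is independent.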

Alternatives and similar repositories for multihead_attn_accelerator

Users interested in multihead_attn_accelerator are comparing it to the libraries listed below.
