bonanyan / attentionlego
☆12 · Updated last year

Alternatives and similar repositories for attentionlego:
Users interested in attentionlego are comparing it to the libraries listed below.
- Accelerate a multi-head attention transformer model using HLS for FPGA (☆11, updated last year)
- Open-source release of the MSD framework (☆16, updated last year)
- (unnamed) (☆12, updated last year)
- (unnamed) (☆26, updated last month)
- Multi-core HW accelerator mapping optimization framework for layer-fused ML workloads (☆51, updated 2 weeks ago)
- (unnamed) (☆15, updated last year)
- Collection of kernel accelerators optimised for LLM execution (☆17, updated last month)
- (unnamed) (☆29, updated 3 years ago)
- A bit-level sparsity-aware multiply-accumulate processing element (☆15, updated 10 months ago)
- C++ code for an HLS FPGA implementation of a transformer (☆16, updated 8 months ago)
- [TCAD'23] AccelTran: A Sparsity-Aware Accelerator for Transformers (☆40, updated last year)
- [ASPLOS 2024] CIM-MLC: A Multi-level Compilation Stack for Computing-In-Memory Accelerators (☆30, updated 11 months ago)
- An efficient spatial accelerator enabling hybrid sparse attention mechanisms for long sequences