This is a simple implementation of SpikedAttention (NeurIPS 2024).
Alternatives and similar repositories for SpikedAttention
Users interested in SpikedAttention are comparing it to the libraries listed below.
- Official implementation of "Scaling Spike-driven Transformer with Efficient Spike Firing Approximation Training" (IEEE T-PAMI 2025)
- Small-footprint, configurable HyperBus core
- [ASP-DAC 2025] Official implementation of "NeuronQuant: Accurate and Efficient Post-Training Quantization for Spiking Neural Networks"
- Official implementation of the HPCA 2025 paper "Prosperity: Accelerating Spiking Neural Networks via Product Sparsity"
- A suite for parallel inference of Diffusion Transformers (DiTs) on multi-GPU clusters
- CamJ: an energy modeling and system-level exploration framework for in-sensor visual computing
- Official implementation of "Quantized Spike-driven Transformer" (ICLR 2025)
- Official implementation of "MetaLA: Unified Optimal Linear Approximation to Softmax Attention Map" (NeurIPS 2024 Oral)
- SyOPs counter for spiking neural networks