☆52 · Jan 28, 2024 · Updated 2 years ago
Alternatives and similar repositories for mamba-triton
Users interested in mamba-triton are comparing it to the libraries listed below.
- Official Implementation of ACL 2023: Don't Parse, Choose Spans! Continuous and Discontinuous Constituency Parsing via Autoregressive Span … ☆14 · Aug 25, 2023 · Updated 2 years ago
- Blog post ☆17 · Feb 16, 2024 · Updated 2 years ago
- Accelerated First Order Parallel Associative Scan ☆197 · Jan 7, 2026 · Updated 4 months ago
- ☆107 · Mar 9, 2024 · Updated 2 years ago
- Here we will test various linear attention designs. ☆62 · Apr 25, 2024 · Updated 2 years ago
- Source-to-Source Debuggable Derivatives in Pure Python ☆15 · Jan 23, 2024 · Updated 2 years ago
- Griffin MQA + Hawk Linear RNN Hybrid ☆89 · Apr 13, 2026 · Updated 3 weeks ago
- Continuous batching and parallel acceleration for RWKV6 ☆22 · Jun 28, 2024 · Updated last year
- [NeurIPS 2023 spotlight] Official implementation of HGRN in our NeurIPS 2023 paper - Hierarchically Gated Recurrent Neural Network for Se… ☆68 · Apr 24, 2024 · Updated 2 years ago
- ☆22 · Dec 15, 2023 · Updated 2 years ago
- Jax implementation of "Griffin: Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models" ☆15 · May 10, 2024 · Updated last year
- Official PyTorch Implementation of the Longhorn Deep State Space Model ☆57 · Dec 4, 2024 · Updated last year
- HGRN2: Gated Linear RNNs with State Expansion ☆57 · Aug 20, 2024 · Updated last year
- Experiment of using Tangent to autodiff Triton ☆82 · Jan 22, 2024 · Updated 2 years ago
- Code for the ACL 2021 paper "Structural Guidance for Transformer Language Models" ☆14 · Sep 17, 2025 · Updated 7 months ago
- Engineering the state of RNN language models (Mamba, RWKV, etc.) ☆32 · May 25, 2024 · Updated last year
- ☆32 · Jan 7, 2024 · Updated 2 years ago
- ☆13 · Feb 7, 2023 · Updated 3 years ago
- ☆17 · Dec 19, 2024 · Updated last year
- Official Repository for Efficient Linear-Time Attention Transformers. ☆18 · Jun 2, 2024 · Updated last year
- An annotated implementation of the Hyena Hierarchy paper ☆34 · May 28, 2023 · Updated 2 years ago
- Gradient-based Hyperparameter Optimization Over Long Horizons ☆14 · Sep 29, 2021 · Updated 4 years ago
- [NeurIPS 2023] Sparse Modular Activation for Efficient Sequence Modeling ☆40 · Dec 2, 2023 · Updated 2 years ago
- Stick-breaking attention ☆63 · Jul 1, 2025 · Updated 10 months ago
- Mamba training library developed by Kotoba Technologies ☆71 · Feb 11, 2024 · Updated 2 years ago
- ☆124 · May 28, 2024 · Updated last year
- My Implementation of Q-Sparse: All Large Language Models can be Fully Sparsely-Activated ☆35 · Aug 14, 2024 · Updated last year
- ☆26 · Feb 26, 2026 · Updated 2 months ago
- Understand and test language model architectures on synthetic tasks. ☆265 · Mar 22, 2026 · Updated last month
- ☆36 · Nov 22, 2024 · Updated last year
- ☆45 · Nov 1, 2025 · Updated 6 months ago
- Open-sourcing code associated with the AAAI-25 paper "On the Expressiveness and Length Generalization of Selective State-Space Models on … ☆16 · Sep 18, 2025 · Updated 7 months ago
- Advanced Formal Language Theory (263-5352-00L; Spring 2023) ☆10 · Feb 21, 2023 · Updated 3 years ago
- Code for the paper: https://arxiv.org/pdf/2309.06979.pdf ☆21 · Jul 29, 2024 · Updated last year
- ☆33 · May 26, 2024 · Updated last year
- ☆37 · Feb 26, 2024 · Updated 2 years ago
- ☆18 · Mar 10, 2023 · Updated 3 years ago