Implementation of MambaByte from the paper "MambaByte: Token-free Selective State Space Model" in PyTorch and Zeta
☆125 · Updated Feb 6, 2026
Alternatives and similar repositories for MambaByte
Users interested in MambaByte are comparing it to the repositories listed below.
- Implementation of MoE Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Ze… (☆122, updated Jan 31, 2026)
- [CoLM 24] Official repository of MambaByte: Token-free Selective State Space Model (☆24, updated Oct 12, 2024)
- Implementation of the Mamba SSM with hf_integration (☆55, updated Aug 31, 2024)
- [ICLR'25] "Understanding Bottlenecks of State Space Models through the Lens of Recency and Over-smoothing" by Peihao Wang, Ruisi Cai, Yue… (☆17, updated Mar 21, 2025)
- Mamba R1: a novel architecture combining the efficiency of Mamba's state space models with the scalability of Mixture of Ex… (☆25, updated Oct 13, 2025)
- Implementation of a Hierarchical Mamba as described in the paper "Hierarchical State Space Models for Continuous Sequence-to-Sequence Mo… (☆15, updated Nov 11, 2024)
- Open-source code for the AAAI-25 paper "On the Expressiveness and Length Generalization of Selective State-Space Models on … (☆16, updated Sep 18, 2025)
- Efficient PScan implementation in PyTorch (☆17, updated Jan 2, 2024)
- Integrating Mamba/SSMs with Transformers for enhanced long-context, high-quality sequence modeling (☆217, updated Jan 30, 2026)
- Code for the paper "Stack Attention: Improving the Ability of Transformers to Model Hierarchical Patterns" (☆18, updated Mar 15, 2024)
- Code for the paper: https://arxiv.org/pdf/2309.06979.pdf (☆21, updated Jul 29, 2024)
- A byte-level decoder architecture that matches the performance of tokenized Transformers (☆67, updated Apr 24, 2024)
- Some preliminary explorations of Mamba's context scaling (☆218, updated Feb 8, 2024)
- PyTorch implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model" (☆210, updated Mar 13, 2026)
- Modified Mamba code to run on CPU (☆30, updated Jan 14, 2024)
- A repository for DenseSSMs (☆90, updated Apr 11, 2024)
- A simpler PyTorch + Zeta implementation of the paper "SiMBA: Simplified Mamba-based Architecture for Vision and Multivariate Time series… (☆28, updated Nov 11, 2024)
- Implementation of a modular, high-performance, simplified Mamba for high-speed applications (☆40, updated Nov 11, 2024)
- TUI application for viewing the status of GPU allocations on a Slurm cluster (☆11, updated Dec 11, 2023)
- Build high-performance AI models with modular building blocks