srush / annotated-s4
Implementation of https://srush.github.io/annotated-s4
☆512 · Updated 7 months ago
Alternatives and similar repositories for annotated-s4
Users interested in annotated-s4 are comparing it to the repositories listed below.
- ☆314 · Updated last year
- Annotated version of the Mamba paper ☆495 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆693 · Updated last week
- ☆367 · Updated last year
- ☆289 · Updated last year
- Long Range Arena for Benchmarking Efficient Transformers ☆776 · Updated 2 years ago
- ☆194 · Updated last year
- Language Modeling with the H3 State Space Model ☆522 · Updated 2 years ago
- For optimization algorithm research and development. ☆558 · Updated 3 weeks ago
- ☆259 · Updated 7 months ago
- Accelerated First Order Parallel Associative Scan ☆196 · Updated 3 weeks ago
- Code for our NeurIPS 2022 paper ☆371 · Updated 3 years ago
- Named tensors with first-class dimensions for PyTorch ☆332 · Updated 2 years ago
- Neural Networks and the Chomsky Hierarchy ☆212 · Updated last year
- ☆163 · Updated 3 years ago
- Puzzles for exploring transformers ☆384 · Updated 2 years ago
- maximal update parametrization (µP) ☆1,672 · Updated last year
- JAX Synergistic Memory Inspector ☆184 · Updated last year
- Sequence modeling with Mega. ☆303 · Updated 3 years ago
- Pytorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition… ☆191 · Updated 3 weeks ago
- Simple, minimal implementation of the Mamba SSM in one pytorch file. Using logcumsumexp (Heisen sequence). ☆130 · Updated last year
- ☆490 · Updated last year
- Efficient optimizers ☆281 · Updated last month
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement… ☆406 · Updated this week
- CIFAR-10 speedruns: 94% in 2.6 seconds and 96% in 27 seconds ☆349 · Updated 2 months ago
- Code release for "Git Re-Basin: Merging Models modulo Permutation Symmetries" ☆502 · Updated 2 years ago
- ☆246 · Updated last year
- Convolutions for Sequence Modeling ☆910 · Updated last year
- Implementation of ST-Moe, the latest incarnation of MoE after years of research at Brain, in Pytorch ☆378 · Updated last year
- ☆215 · Updated last year
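One entry above describes a minimal Mamba SSM implemented with `logcumsumexp` (the "Heisen sequence" trick). As a hedged sketch of that idea, not the repo's actual code: a linear recurrence h_t = a_t * h_{t-1} + b_t can be evaluated in parallel entirely in the log domain, assuming a_t, b_t > 0 (the function name `scan_logcumsumexp` and the toy shapes are illustrative, not from any listed library).

```python
import torch

def scan_logcumsumexp(log_a, log_b):
    """Compute log h_t where h_t = a_t * h_{t-1} + b_t, h_0 = 0.

    Assumes a_t, b_t > 0 so the whole scan stays in log space.
    Unrolling gives h_t = sum_{s<=t} exp(A_t - A_s) * b_s with
    A_t = sum_{r<=t} log a_r, which is one cumsum plus one logcumsumexp.
    """
    A = torch.cumsum(log_a, dim=-1)                  # A_t = log prod_{r<=t} a_r
    return A + torch.logcumsumexp(log_b - A, dim=-1)

# Check against the naive sequential recurrence.
torch.manual_seed(0)
a = torch.rand(8) * 0.9 + 0.05   # decay coefficients in (0, 1)
b = torch.rand(8) + 0.1          # positive inputs
log_h = scan_logcumsumexp(a.log(), b.log())

h_ref, h = [], torch.tensor(0.0)
for t in range(8):               # sequential reference
    h = a[t] * h + b[t]
    h_ref.append(h)

print(torch.allclose(log_h.exp(), torch.stack(h_ref), atol=1e-4))  # True
```

Working in log space trades the sign restriction for numerical stability over long sequences, which is why it suits SSM-style recurrences with positive decay terms.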