OpenNLPLab / ETSC-Exact-Toeplitz-to-SSM-Conversion
[EMNLP 2023] Official implementation of the ETSC (Exact Toeplitz-to-SSM Conversion) algorithm from our EMNLP 2023 paper - Accelerating Toeplitz Neural Network with Constant-time Inference Complexity
☆14 · Oct 17, 2023 · Updated 2 years ago
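For context, the snippet below is not the ETSC algorithm from the paper, only a minimal numerical sketch of the idea it relies on: a causal Toeplitz operator whose kernel decays as a sum of exponentials can be evaluated as a diagonal state-space recurrence with constant cost per token. All names (`lam`, `c`, `n_modes`) are made up for this illustration and do not come from the repository.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_modes = 16, 4                       # sequence length, number of exponential modes
lam = rng.uniform(0.1, 0.9, n_modes)     # per-mode decay rates (assumed, not learned)
c = rng.normal(size=n_modes)             # per-mode output weights
x = rng.normal(size=T)                   # input sequence

# Toeplitz / convolution view: y_t = sum_{i<=t} k_i * x_{t-i}, with k_i = sum_j c_j * lam_j**i.
k = (c[None, :] * lam[None, :] ** np.arange(T)[:, None]).sum(axis=1)
y_toeplitz = np.array([np.dot(k[: t + 1], x[t::-1]) for t in range(T)])

# SSM view: s_t = lam * s_{t-1} + x_t (elementwise), y_t = c . s_t  ->  O(1) work per step.
s = np.zeros(n_modes)
y_ssm = np.empty(T)
for t in range(T):
    s = lam * s + x[t]
    y_ssm[t] = c @ s

assert np.allclose(y_toeplitz, y_ssm)    # both parameterizations produce identical outputs
```

The paper's contribution is performing this kind of conversion exactly for the relative-position kernels a trained Toeplitz Neural Network actually uses; the toy above only checks the algebraic equivalence for a hand-built sum-of-exponentials kernel.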
Alternatives and similar repositories for ETSC-Exact-Toeplitz-to-SSM-Conversion
Users interested in ETSC-Exact-Toeplitz-to-SSM-Conversion are comparing it to the libraries listed below
- ☆20 · Apr 17, 2023 · Updated 2 years ago
- ☆11 · Oct 11, 2023 · Updated 2 years ago
- [ICLR 2023] Official implementation of the Toeplitz Neural Network (TNN) in our ICLR 2023 paper - Toeplitz Neural Network for Sequence Modeling ☆81 · Apr 24, 2024 · Updated last year
- [EMNLP 2022] Official implementation of Transnormer in our EMNLP 2022 paper - The Devil in Linear Transformer ☆64 · Jul 30, 2023 · Updated 2 years ago
- Fine-Tuning Pre-trained Transformers into Decaying Fast Weights ☆19 · Oct 9, 2022 · Updated 3 years ago
- ☆29 · Jul 9, 2024 · Updated last year
- ☆16 · Dec 19, 2024 · Updated last year
- Awesome Triton Resources ☆39 · Apr 27, 2025 · Updated 9 months ago
- ☆20 · May 30, 2024 · Updated last year
- AGaLiTe: Approximate Gated Linear Transformers for Online Reinforcement Learning (Published in TMLR) ☆23 · Oct 15, 2024 · Updated last year
- ☆19 · Dec 4, 2025 · Updated 2 months ago
- Official code for the paper "Attention as a Hypernetwork" ☆47 · Jun 22, 2024 · Updated last year
- ☆22 · Dec 15, 2023 · Updated 2 years ago
- Statistical discontinuous constituent parsing ☆11 · Feb 15, 2018 · Updated 7 years ago
- RADLADS training code ☆36 · May 7, 2025 · Updated 9 months ago
- Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024) ☆24 · Jun 6, 2024 · Updated last year
- A method for evaluating the high-level coherence of machine-generated texts. Identifies high-level coherence issues in transformer-based … ☆11 · Mar 18, 2023 · Updated 2 years ago
- PyTorch implementation for PaLM: A Hybrid Parser and Language Model. ☆10 · Jan 7, 2020 · Updated 6 years ago
- ☆12 · Jan 29, 2021 · Updated 5 years ago
- Repository for SPECTRA: Sparse Structured Text Rationalization, accepted at EMNLP 2021 main conference. ☆10 · Feb 14, 2024 · Updated last year
- Source code for the NAACL 2022 main conference paper "Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs" ☆10 · Sep 26, 2022 · Updated 3 years ago
- Advanced Formal Language Theory (263-5352-00L; Spring 2023) ☆10 · Feb 21, 2023 · Updated 2 years ago
- Source code for "N-ary Constituent Tree Parsing with Recursive Semi-Markov Model" published at ACL 2021☆10May 27, 2021Updated 4 years ago
- ☆11Oct 13, 2019Updated 6 years ago
- Implementation of Hyena Hierarchy in JAX☆10Apr 30, 2023Updated 2 years ago
- Checkpointable dataset utilities for foundation model training☆32Jan 29, 2024Updated 2 years ago
- Experiments on the impact of depth in transformers and SSMs.☆40Oct 23, 2025Updated 3 months ago
- APPy (Annotated Parallelism for Python) enables users to annotate loops and tensor expressions in Python with compiler directives akin to…☆30Jan 28, 2026Updated 2 weeks ago
- Open-sourcing code associated with the AAAI-25 paper "On the Expressiveness and Length Generalization of Selective State-Space Models on …☆14Sep 18, 2025Updated 4 months ago
- Implementation of Cascaded Head-colliding Attention (ACL'2021)☆11Sep 16, 2021Updated 4 years ago
- Recursive Bayesian Networks☆11May 11, 2025Updated 9 months ago
- Official Repository for "Modeling Hierarchical Structures with Continuous Recursive Neural Networks" (ICML 2021)☆11Aug 18, 2021Updated 4 years ago
- Code for the paper "Stack Attention: Improving the Ability of Transformers to Model Hierarchical Patterns"☆18Mar 15, 2024Updated last year
- ☆12Mar 7, 2022Updated 3 years ago
- ☆14Feb 1, 2024Updated 2 years ago
- Code for the paper "Understanding the Mechanics of SPIGOT: Surrogate Gradients for Latent Structure Learning"☆11May 5, 2021Updated 4 years ago
- ☆12Dec 13, 2022Updated 3 years ago
- ☆13Dec 15, 2025Updated last month
- Code for "Discovering Non-monotonic Autoregressive Orderings with Variational Inference" (paper and code updated from ICLR 2021)☆12Mar 7, 2024Updated last year