MurtyShikhar / Pushdown-Layers
Code for Pushdown Layers from our EMNLP 2023 paper
☆29 · Updated 2 years ago
Alternatives and similar repositories for Pushdown-Layers
Users interested in Pushdown-Layers are comparing it to the libraries listed below.
- ☆31 · Updated 2 years ago
- The official repository for our paper "The Neural Data Router: Adaptive Control Flow in Transformers Improves Systematic Generalization". ☆34 · Updated 7 months ago
- Source code of NAACL 2021 "PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols" and ACL 2021 main conferenc… ☆51 · Updated 10 months ago
- Code for "Discovering Non-monotonic Autoregressive Orderings with Variational Inference" (paper and code updated from ICLR 2021)☆12Updated last year
- ☆51Updated 2 years ago
- ☆12Updated 5 years ago
- Official Implementation of ACL2023: Don't Parse, Choose Spans! Continuous and Discontinuous Constituency Parsing via Autoregressive Span …☆14Updated 2 years ago
- lanmt ebm☆12Updated 5 years ago
- Stick-breaking attention☆62Updated 6 months ago
- Advanced Formal Language Theory (263-5352-00L; Spring 2023) ☆10 · Updated 2 years ago
- Efficient Transformers with Dynamic Token Pooling ☆67 · Updated 2 years ago
- A probabilistic model for contextual word representation. Accepted to ACL 2023 Findings. ☆25 · Updated 2 years ago
- ☆52 · Updated 3 years ago
- Learning to Model Editing Processes ☆26 · Updated 5 months ago
- ☆22 · Updated 4 years ago
- ☆20 · Updated 2 years ago
- ☆29 · Updated 4 years ago
- Code for the ACL 2021 paper "Structural Guidance for Transformer Language Models" ☆13 · Updated 4 months ago
- Experiments on the impact of depth in transformers and SSMs. ☆40 · Updated 3 months ago
- ☆13 · Updated 2 years ago
- ☆45 · Updated 4 years ago
- Code for "Does syntax need to grow on trees? Sources of inductive bias in sequence to sequence networks" ☆24 · Updated 6 years ago
- Repo for ICML 2023 "Why do Nearest Neighbor Language Models Work?" ☆59 · Updated 3 years ago
- ☆11 · Updated last year
- ☆15 · Updated last year
- ☆12 · Updated 3 years ago
- A method for evaluating the high-level coherence of machine-generated texts. Identifies high-level coherence issues in transformer-based … ☆11 · Updated 2 years ago
- ☆22 · Updated 3 years ago
- Fine-Tuning Pre-trained Transformers into Decaying Fast Weights ☆19 · Updated 3 years ago
- ☆57 · Updated last year