akikoe / tree2seq
C++ code of "Tree-to-Sequence Attentional Neural Machine Translation (tree2seq ANMT)"
☆57 · Updated 7 years ago
Alternatives and similar repositories for tree2seq
Users interested in tree2seq are comparing it to the repositories listed below.
- Dynamic Entity Representation (Kobayashi et al., 2016) ☆20 · Updated 8 years ago
- Implementation of "Controlling Output Length in Neural Encoder-Decoders" ☆42 · Updated 7 years ago
- ☆47 · Updated 8 years ago
- ☆19 · Updated 6 years ago
- Dependency tree-based RNN ☆32 · Updated 8 years ago
- Graph-based Dependency Parser ☆46 · Updated 9 years ago
- Author implementation of "Learning Recurrent Span Representations for Extractive Question Answering" (Lee et al., 2016) ☆33 · Updated 8 years ago
- Easy-first dependency parser based on Hierarchical Tree LSTMs ☆33 · Updated 8 years ago
- BiCVM Code ☆45 · Updated 7 years ago
- ☆18 · Updated 7 years ago
- ☆28 · Updated 9 years ago
- ☆44 · Updated 7 years ago
- Codebase for "Global Neural CCG Parsing with Optimality Guarantees" ☆25 · Updated 8 years ago
- ☆50 · Updated 8 years ago
- An attentional NMT model in DyNet ☆26 · Updated 6 years ago
- modlm: A toolkit for mixture-of-distributions language models ☆27 · Updated 7 years ago
- Code and workflow for reproducing the stochastic decoder experiments ☆15 · Updated 7 years ago
- Top-down Tree LSTM (NAACL 2016) http://aclweb.org/anthology/N/N16/N16-1035.pdf ☆83 · Updated 8 years ago
- Unofficial implementations of attention models on the SNLI dataset ☆33 · Updated 6 years ago
- ☆19 · Updated 8 years ago
- Transition-based joint syntactic dependency parser and semantic role labeler using a stack LSTM RNN architecture ☆61 · Updated 8 years ago
- Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder ☆42 · Updated 7 years ago
- Code for the EACL paper "Using the Output Embedding to Improve Language Models" by Ofir Press and Lior Wolf ☆46 · Updated 7 years ago
- C++ code of "Learning to Parse and Translate Improves Neural Machine Translation" ☆21 · Updated 8 years ago
- Implementation of "Arc-swift: A Novel Transition System for Dependency Parsing" ☆32 · Updated 6 years ago
- An implementation of the Transformer (Attention Is All You Need) in DyNet ☆65 · Updated last year
- Decoding platform for machine translation research ☆55 · Updated 5 years ago
- ☆25 · Updated 9 years ago
- Recurrent Memory Network, Torch implementation ☆35 · Updated 8 years ago
- Attention-based NMT with a coverage mechanism indicating whether a source word has been translated ☆111 · Updated 5 years ago