harvardnlp / struct-attn
Code for Structured Attention Networks https://arxiv.org/abs/1702.00887
☆238 · Updated 8 years ago
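For orientation: the paper replaces the usual independent softmax over positions with the marginals of a structured model (e.g. a linear-chain CRF), computed exactly by forward-backward. A minimal NumPy sketch of that idea follows; the function names, shapes, and binary attend/not-attend state space are illustrative, not the repo's actual API.

```python
import numpy as np

def logsumexp(x):
    # Numerically stable log(sum(exp(x))).
    m = x.max()
    return m + np.log(np.exp(x - m).sum())

def softmax_attention(scores):
    # Standard attention: one independent softmax over positions.
    e = np.exp(scores - scores.max())
    return e / e.sum()

def crf_attention(unary, pairwise):
    # Structured attention (sketch): the attention weight on position i is
    # the marginal probability p(z_i = 1) under a linear-chain CRF with
    # binary states (0 = not attended, 1 = attended), computed exactly by
    # forward-backward in log space.
    #   unary[i, s]   : score for position i taking state s
    #   pairwise[s, t]: transition score between adjacent states s -> t
    n, S = unary.shape
    alpha = np.zeros((n, S))
    beta = np.zeros((n, S))
    alpha[0] = unary[0]
    for i in range(1, n):                       # forward pass
        for t in range(S):
            alpha[i, t] = unary[i, t] + logsumexp(alpha[i - 1] + pairwise[:, t])
    for i in range(n - 2, -1, -1):              # backward pass
        for s in range(S):
            beta[i, s] = logsumexp(pairwise[s] + unary[i + 1] + beta[i + 1])
    log_z = logsumexp(alpha[-1])                # log partition function
    marginals = np.exp(alpha + beta - log_z)    # p(z_i = s) for all i, s
    return marginals[:, 1]                      # attend-probability per position
```

Unlike a softmax, these weights need not sum to 1 across positions, so the model can place mass on several contiguous segments at once — the transition scores are what couple neighboring positions.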
Alternatives and similar repositories for struct-attn:
Users interested in struct-attn are comparing it to the repositories listed below.
- attention model for entailment on SNLI corpus implemented in Tensorflow and Keras ☆177 · Updated 8 years ago
- ☆79 · Updated 8 years ago
- The implementation of key value memory networks in tensorflow ☆248 · Updated 6 years ago
- Implementation of A Structured Self-attentive Sentence Embedding ☆108 · Updated 6 years ago
- ☆149 · Updated 2 years ago
- Dynamic evaluation for pytorch language models, now includes hyperparameter tuning ☆104 · Updated 7 years ago
- End-To-End Memory Networks in Theano ☆130 · Updated 2 years ago
- Gated Attention Reader for Text Comprehension ☆188 · Updated 7 years ago
- Tensorflow implementation of Dynamic Coattention Networks for Question Answering. ☆100 · Updated 8 years ago
- Decomposable Attention Model for Sentence Pair Classification (from https://arxiv.org/abs/1606.01933) ☆95 · Updated 8 years ago
- Tensorflow implementation of Recursive Neural Networks using LSTM units ☆136 · Updated 8 years ago
- Mixed Incremental Cross-Entropy REINFORCE ICLR 2016 ☆331 · Updated 8 years ago
- Language Modeling ☆156 · Updated 5 years ago
- Top-down Tree LSTM (NAACL 2016) http://aclweb.org/anthology/N/N16/N16-1035.pdf ☆83 · Updated 8 years ago
- This is an implementation of the Attention Sum Reader model as presented in "Text Comprehension with the Attention Sum Reader Network" av… ☆98 · Updated 8 years ago
- End-To-End Memory Network using Tensorflow ☆342 · Updated 8 years ago
- ☆134 · Updated 7 years ago
- Implementation of Attention-over-Attention Neural Networks for Reading Comprehension (https://arxiv.org/abs/1607.04423) in TensorFlow ☆177 · Updated 8 years ago
- Hierarchical Encoder Decoder for Dialog Modelling ☆95 · Updated 6 years ago
- Code to train state-of-the-art Neural Machine Translation systems. ☆105 · Updated 8 years ago
- in progress ☆187 · Updated 7 years ago
- Intent parsing and slot filling in PyTorch with seq2seq + attention ☆159 · Updated 7 years ago
- nmtpy is a Python framework based on dl4mt-tutorial to experiment with Neural Machine Translation pipelines. ☆125 · Updated 7 years ago
- Query-Reduction Networks (QRN) ☆137 · Updated 7 years ago
- Attention-based NMT with a coverage mechanism to indicate whether a source word is translated or not ☆111 · Updated 5 years ago
- ☆179 · Updated 6 years ago
- Neural Coref Models ☆107 · Updated 6 years ago
- Load pretrained word embeddings (word2vec, glove format) into torch.FloatTensor for PyTorch ☆88 · Updated 5 years ago
- ☆165 · Updated 8 years ago
- ☆167 · Updated 8 years ago