shyamupa / snli-entailment
Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras
☆177 · Updated 8 years ago
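For orientation, here is a minimal sketch of the kind of model the repository describes: the final hypothesis encoding attends over the premise states (in the spirit of Rocktäschel et al.'s SNLI attention model), written with the tf.keras functional API. This is not the repository's actual code; the vocabulary size, layer sizes, and sequence lengths below are illustrative assumptions.

```python
# Minimal sketch of an attention-based SNLI entailment model (tf.keras).
# VOCAB, EMBED, HIDDEN, PREM_LEN, HYP_LEN are illustrative assumptions,
# not values taken from the repository.
from tensorflow.keras.layers import (Input, Embedding, LSTM, Dense, RepeatVector,
                                     Add, Activation, Softmax, Dot, Flatten,
                                     Concatenate)
from tensorflow.keras.models import Model

VOCAB, EMBED, HIDDEN = 20000, 100, 150
PREM_LEN, HYP_LEN = 30, 20

premise = Input(shape=(PREM_LEN,), dtype="int32", name="premise")
hypothesis = Input(shape=(HYP_LEN,), dtype="int32", name="hypothesis")

embed = Embedding(VOCAB, EMBED)                                    # shared word embeddings
p_states = LSTM(HIDDEN, return_sequences=True)(embed(premise))     # (batch, PREM_LEN, HIDDEN)
h_final = LSTM(HIDDEN)(embed(hypothesis))                          # (batch, HIDDEN)

# Attention of the hypothesis encoding over the premise states:
# score_i = v^T tanh(W_y * y_i + W_h * h_N)
proj_p = Dense(HIDDEN, use_bias=False)(p_states)                          # W_y * y_i
proj_h = RepeatVector(PREM_LEN)(Dense(HIDDEN, use_bias=False)(h_final))  # W_h * h_N, tiled over time
scores = Dense(1, use_bias=False)(Activation("tanh")(Add()([proj_p, proj_h])))
alphas = Softmax(axis=1)(scores)                                   # attention weights over premise tokens

# Weighted sum of premise states, then combine with the hypothesis encoding.
r = Flatten()(Dot(axes=1)([alphas, p_states]))                     # (batch, HIDDEN)
h_star = Activation("tanh")(Dense(HIDDEN)(Concatenate()([r, h_final])))

label = Dense(3, activation="softmax", name="label")(h_star)       # entailment / neutral / contradiction
model = Model([premise, hypothesis], label)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

Training such a model would take padded token-id arrays for premise and hypothesis plus one-hot labels over the three SNLI classes.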
Alternatives and similar repositories for snli-entailment:
Users interested in snli-entailment are comparing it to the libraries listed below.
- Dynamic Convolutional Neural Networks for Theano/Lasagne ☆151 · Updated 7 years ago
- Tensorflow implementation of Recursive Neural Networks using LSTM units ☆136 · Updated 8 years ago
- Hierarchical encoder-decoder framework for sequences of words, sentences, paragraphs and documents using LSTM and GRU in Theano ☆108 · Updated 8 years ago
- Tensorflow implementation of Dynamic Coattention Networks for Question Answering. ☆100 · Updated 8 years ago
- End-To-End Memory Networks in Theano ☆130 · Updated 2 years ago
- ☆79 · Updated 8 years ago
- Multi-Perspective Convolutional Neural Networks for modeling textual similarity (He et al., EMNLP 2015) ☆106 · Updated 6 years ago
- Gated Attention Reader for Text Comprehension ☆188 · Updated 7 years ago
- Neural Attention Model for Abstractive Summarization ☆73 · Updated 9 years ago
- Character-Aware Neural Language Models. A keras-based implementation ☆118 · Updated 3 years ago
- ☆149 · Updated 2 years ago
- LSTM-CRF models for sequence labeling in text. ☆174 · Updated 7 years ago
- ☆143 · Updated 7 years ago
- Implementation of A Structured Self-attentive Sentence Embedding ☆108 · Updated 6 years ago
- Python code for training all models in the ICLR paper, "Towards Universal Paraphrastic Sentence Embeddings". These models achieve strong … ☆192 · Updated 9 years ago
- Molding CNNs for text (http://arxiv.org/abs/1508.04112) ☆85 · Updated 8 years ago
- in progress ☆187 · Updated 7 years ago
- ☆218 · Updated 9 years ago
- Code to train state-of-the-art Neural Machine Translation systems. ☆105 · Updated 8 years ago
- This is an implementation of the Attention Sum Reader model as presented in "Text Comprehension with the Attention Sum Reader Network" av… ☆98 · Updated 8 years ago
- ☆167 · Updated 8 years ago
- NLP tools on Lasagne ☆61 · Updated 7 years ago
- Simple Keras model that tackles the Stanford Natural Language Inference (SNLI) corpus using summation and/or recurrent neural networks ☆264 · Updated 8 years ago
- ☆121 · Updated 8 years ago
- Code for Structured Attention Networks https://arxiv.org/abs/1702.00887 ☆237 · Updated 7 years ago
- ☆165 · Updated 8 years ago
- nmtpy is a Python framework based on dl4mt-tutorial to experiment with Neural Machine Translation pipelines. ☆125 · Updated 7 years ago
- ByteNet for character-level language modelling ☆318 · Updated 7 years ago
- Implementation of Attention-over-Attention Neural Networks for Reading Comprehension (https://arxiv.org/abs/1607.04423) in TensorFlow ☆177 · Updated 8 years ago
- Top-down Tree LSTM (NAACL 2016) http://aclweb.org/anthology/N/N16/N16-1035.pdf ☆83 · Updated 8 years ago