castorini/VDPWI-NN-Torch
Very Deep Pairwise Word Interaction Neural Networks for modeling textual similarity (He and Lin, NAACL/HLT 2016)
☆ 19 · Updated 6 years ago
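The core idea behind the model named above is to compare every word of one sentence against every word of the other, producing a multi-channel similarity "cube" that a deep CNN then consumes. A minimal NumPy sketch of that pairwise interaction step (not the authors' Torch code; function name and the choice of cosine, L2-distance, and dot-product channels are illustrative assumptions based on the paper's description):

```python
# Sketch of the pairwise word interaction step: given contextual word
# vectors for two sentences, build a (channels, m, n) similarity cube.
import numpy as np

def interaction_cube(s1, s2, eps=1e-8):
    """s1: (m, d), s2: (n, d) word vectors -> (3, m, n) similarity cube."""
    dot = s1 @ s2.T                                    # (m, n) dot products
    n1 = np.linalg.norm(s1, axis=1, keepdims=True)     # (m, 1) vector norms
    n2 = np.linalg.norm(s2, axis=1, keepdims=True)     # (n, 1)
    cos = dot / (n1 * n2.T + eps)                      # cosine similarity channel
    l2 = np.linalg.norm(s1[:, None, :] - s2[None, :, :], axis=2)  # L2 distance channel
    return np.stack([cos, l2, dot])                    # (3, m, n)

rng = np.random.default_rng(0)
cube = interaction_cube(rng.standard_normal((5, 4)),
                        rng.standard_normal((7, 4)))
print(cube.shape)  # (3, 5, 7)
```

In the full model this cube is built from BiLSTM hidden states rather than raw embeddings, and a "focus" layer re-weights the strongest word pairs before the convolutional stack.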
Related projects
Alternatives and complementary repositories for VDPWI-NN-Torch
- Document context language models ☆ 22 · Updated 9 years ago
- Recurrent Memory Network, Torch implementation ☆ 36 · Updated 8 years ago
- An attentional NMT model in DyNet ☆ 26 · Updated 5 years ago
- Author implementation of "Learning Recurrent Span Representations for Extractive Question Answering" (Lee et al. 2016) ☆ 33 · Updated 7 years ago
- Cross-lingual Dependency Parsing Based on Distributed Representations ☆ 20 · Updated 6 years ago
- A latent variable RNN model for discourse-driven language modeling ☆ 36 · Updated 8 years ago
- An aspiring attempt to generate a continuous space of sentences with DenseNet ☆ 27 · Updated 7 years ago
- Top-down Tree LSTM (NAACL 2016) http://aclweb.org/anthology/N/N16/N16-1035.pdf ☆ 84 · Updated 7 years ago
- Towards cross-lingual distributed representations without parallel text, trained with adversarial autoencoders ☆ 22 · Updated 8 years ago
- NLP tools on Lasagne ☆ 61 · Updated 7 years ago
- Unofficial implementations of attention models on the SNLI dataset ☆ 34 · Updated 6 years ago
- Code for the NAACL paper "Unsupervised Multi-Domain Adaptation with Feature Embeddings" ☆ 33 · Updated 9 years ago
- End-to-end memory networks in TensorFlow ☆ 34 · Updated 7 years ago
- Environment for evaluating the reasoning capabilities of artificial agents ☆ 27 · Updated 7 years ago
- Code for the ACL 2015 paper "Gated Recursive Neural Network for Chinese Word Segmentation" ☆ 28 · Updated 8 years ago
- Moro files for the ACL 2015 tutorial on Matrix and Tensor Factorization Methods for Natural Language Processing ☆ 20 · Updated 9 years ago