clab/lstm-parser
Transition-based dependency parser based on stack LSTMs
☆205 · Updated 5 years ago
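For context, parsers like this one build a dependency tree by applying a sequence of transitions to a stack and a buffer; in the stack-LSTM parser those structures are encoded with LSTMs to score each next action. The arc-standard transition system itself can be sketched as follows (a minimal illustration under that assumption, not this repository's actual code; the `parse` function and the hand-written action sequence are hypothetical):

```python
def parse(words, actions):
    """Apply SHIFT / LEFT-ARC / RIGHT-ARC transitions (arc-standard style);
    return heads[i] = index of word i's head, or None for the root."""
    stack, buffer = [], list(range(len(words)))
    heads = [None] * len(words)
    for act in actions:
        if act == "SHIFT":
            # Move the next buffer word onto the stack.
            stack.append(buffer.pop(0))
        elif act == "LEFT-ARC":
            # Second-from-top stack word becomes a dependent of the top word.
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        elif act == "RIGHT-ARC":
            # Top stack word becomes a dependent of the word beneath it.
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads

# "the cat sleeps": det(cat, the), nsubj(sleeps, cat); "sleeps" stays as root.
heads = parse(["the", "cat", "sleeps"],
              ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "LEFT-ARC"])
# heads == [1, 2, None]
```

In the actual parser, the action at each step is chosen by a classifier over LSTM encodings of the stack, buffer, and action history rather than supplied as a fixed list.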
Alternatives and similar repositories for lstm-parser:
Users who are interested in lstm-parser are comparing it to the libraries listed below.
- Neural Coref Models ☆107 · Updated 6 years ago
- Graph-based and transition-based dependency parsers based on BiLSTMs ☆275 · Updated 7 years ago
- NLP tools on Lasagne ☆61 · Updated 7 years ago
- Recurrent neural network grammars ☆187 · Updated 6 years ago
- SPINN (Stack-augmented Parser-Interpreter Neural Network): fast, batchable, context-aware TreeRNNs ☆206 · Updated 7 years ago
- A multilingual dependency parser based on linear programming relaxations ☆115 · Updated 6 years ago
- An implementation of the Attention Sum Reader model presented in "Text Comprehension with the Attention Sum Reader Network" by… ☆98 · Updated 8 years ago
- Extension of the original word2vec using different architectures ☆210 · Updated 8 years ago
- (no description) ☆58 · Updated 9 years ago
- Python code for training all models in the ICLR paper "Towards Universal Paraphrastic Sentence Embeddings". These models achieve strong… ☆192 · Updated 9 years ago
- (no description) ☆54 · Updated 9 years ago
- BiCVM code ☆45 · Updated 6 years ago
- Decomposable Attention Model for sentence pair classification (from https://arxiv.org/abs/1606.01933) ☆95 · Updated 8 years ago
- A multilingual and multilevel representation learning toolkit for NLP ☆116 · Updated 7 years ago
- End-to-End Memory Networks in Theano ☆130 · Updated 2 years ago
- Attention-based NMT with a coverage mechanism to indicate whether a source word has been translated ☆111 · Updated 4 years ago
- Code to train state-of-the-art neural machine translation systems ☆105 · Updated 8 years ago
- Attention-based NMT with coverage, context gate, and reconstruction ☆94 · Updated 4 years ago
- Hierarchical encoder-decoder for dialogue modelling ☆95 · Updated 6 years ago
- Open-source implementation of the BilBOWA (Bilingual Bag-of-Words without Alignments) word embedding model ☆69 · Updated 3 years ago
- Code to train and use models from "Charagram: Embedding Words and Sentences via Character n-grams" ☆124 · Updated 8 years ago
- A suite of representation learning models for sentence embedding, and some tasks to evaluate them on ☆83 · Updated 6 years ago
- GraphParser, a semantic parser that converts natural-language sentences to logical forms and graphs ☆123 · Updated 7 years ago
- Code for Structured Attention Networks (https://arxiv.org/abs/1702.00887) ☆238 · Updated 8 years ago
- Bidirectional long short-term memory (bi-LSTM) tagger in DyNet, hierarchical with word and character embeddings ☆122 · Updated last year
- Neural attention model for abstractive summarization ☆73 · Updated 9 years ago
- Gated-Attention Reader for text comprehension ☆188 · Updated 7 years ago
- A neural TurboSemanticParser, as described in "Deep Multitask Learning for Semantic Dependency Parsing" (Peng et al., ACL 2017) ☆70 · Updated 4 years ago
- Molding CNNs for text (http://arxiv.org/abs/1508.04112) ☆85 · Updated 8 years ago
- Graph-based dependency parser ☆46 · Updated 9 years ago