harvardnlp / seq2seq-attn
Sequence-to-sequence model with LSTM encoder/decoders and attention
☆1,277 · Updated 4 years ago
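For orientation, here is a minimal sketch of the pattern this repository implements: an LSTM encoder, an LSTM decoder, and attention over the encoder states. The actual seq2seq-attn code is written in Lua/Torch; the PyTorch module below, including all names, layer sizes, and the dot-product (Luong-style) attention variant, is an illustrative assumption rather than the repository's API.

```python
# Minimal sketch (not the repository's Lua/Torch code): an LSTM encoder/decoder
# with dot-product attention over encoder states. All names and sizes are
# illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2SeqAttn(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb=256, hid=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hid, batch_first=True)
        self.decoder = nn.LSTM(emb, hid, batch_first=True)
        self.attn_out = nn.Linear(2 * hid, hid)    # combine context + decoder state
        self.generator = nn.Linear(hid, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sequence.
        enc_states, (h, c) = self.encoder(self.src_emb(src))        # (B, S, H)
        # Decode the target sequence, initialized from the encoder's final state.
        dec_states, _ = self.decoder(self.tgt_emb(tgt), (h, c))     # (B, T, H)
        # Dot-product attention: each decoder state attends over encoder states.
        scores = torch.bmm(dec_states, enc_states.transpose(1, 2))  # (B, T, S)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_states)                    # (B, T, H)
        # Combine context with the decoder state, then predict target tokens.
        combined = torch.tanh(self.attn_out(torch.cat([context, dec_states], dim=-1)))
        return self.generator(combined)                             # (B, T, tgt_vocab)

# Usage sketch: batch of 2, source length 7, target length 5.
model = Seq2SeqAttn(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (2, 7))
tgt = torch.randint(0, 1200, (2, 5))
logits = model(src, tgt)   # (2, 5, 1200)
```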
Alternatives and similar repositories for seq2seq-attn
Users interested in seq2seq-attn are comparing it to the libraries listed below.
- LSTM language model with CNN over characters ☆830 · Updated 8 years ago
- ☆617 · Updated 8 years ago
- Library for implementing RNNs with Theano ☆604 · Updated 9 years ago
- Tree-structured Long Short-Term Memory networks (http://arxiv.org/abs/1503.00075) ☆888 · Updated 8 years ago
- "End-To-End Memory Networks" in TensorFlow ☆827 · Updated 8 years ago
- ☆834 · Updated 8 years ago
- in progress ☆784 · Updated 7 years ago
- ☆675 · Updated 6 years ago
- Neural Attention Model for Abstractive Summarization ☆918 · Updated 7 years ago
- Sequence to Sequence Models with PyTorch ☆738 · Updated 3 years ago
- Dynamic seq2seq in TensorFlow, step by step ☆996 · Updated 7 years ago
- Memory Networks implementations ☆1,757 · Updated 5 years ago
- Sent2Vec encoder and training code from the paper "Skip-Thought Vectors" ☆2,051 · Updated 5 years ago
- Some language modeling tools for Keras ☆659 · Updated 7 years ago
- Question answering dataset featured in "Teaching Machines to Read and Comprehend" ☆1,297 · Updated 8 years ago
- Language Model GRU with Python and Theano ☆501 · Updated 9 years ago
- Open-Source Neural Machine Translation in TensorFlow ☆799 · Updated 2 years ago
- Recurrent Neural Network library for Torch7's nn ☆943 · Updated 7 years ago
- Sequence to sequence learning using TensorFlow. ☆389 · Updated 7 years ago
- Attention-based sequence to sequence learning