yangsaiyong / tf-adaptive-softmax-lstm-lm
Experiment results for LSTM language models on PTB (Penn Treebank) and GBW (Google Billion Word) using AdaptiveSoftmax in TensorFlow.
☆100 · Updated 6 years ago
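The repository itself only reports benchmark results, so below is a minimal sketch of the adaptive-softmax idea it measures, written against TF2/Keras rather than the repository's own code; the class name, cutoff points, and reduction factor are illustrative assumptions, not its API.

```python
import tensorflow as tf

class AdaptiveSoftmaxSketch(tf.keras.layers.Layer):
    """Illustrative two-level adaptive softmax: a small head over frequent
    words plus reduced-dimension tail clusters for the rest of the vocabulary."""

    def __init__(self, hidden_dim, cutoffs, vocab_size, reduction=4):
        super().__init__()
        self.cutoffs = [0] + list(cutoffs) + [vocab_size]
        n_tails = len(self.cutoffs) - 2
        # The head scores the most frequent words and one gate per tail cluster.
        self.head = tf.keras.layers.Dense(self.cutoffs[1] + n_tails)
        # Each tail projects to a smaller dimension before its own softmax,
        # which is where the speed-up over a full softmax comes from.
        self.tails = [
            tf.keras.Sequential([
                tf.keras.layers.Dense(max(1, hidden_dim // (reduction ** (i + 1)))),
                tf.keras.layers.Dense(self.cutoffs[i + 2] - self.cutoffs[i + 1]),
            ])
            for i in range(n_tails)
        ]

    def log_prob(self, hidden):
        """Full [batch, vocab_size] log-probabilities (evaluation path)."""
        head_logp = tf.nn.log_softmax(self.head(hidden), axis=-1)
        pieces = [head_logp[:, : self.cutoffs[1]]]
        for i, tail in enumerate(self.tails):
            # Tail word log-prob = log P(cluster | hidden) + log P(word | cluster, hidden).
            gate = head_logp[:, self.cutoffs[1] + i : self.cutoffs[1] + i + 1]
            pieces.append(gate + tf.nn.log_softmax(tail(hidden), axis=-1))
        return tf.concat(pieces, axis=-1)

# Example: hidden states from an LSTM with 512 units, 50k-word vocabulary.
layer = AdaptiveSoftmaxSketch(hidden_dim=512, cutoffs=[2000, 10000], vocab_size=50000)
logp = layer.log_prob(tf.random.normal([8, 512]))  # shape [8, 50000]
```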
Alternatives and similar repositories for tf-adaptive-softmax-lstm-lm:
Users interested in tf-adaptive-softmax-lstm-lm are comparing it to the libraries listed below.
- Implementation of Attention-over-Attention Neural Networks for Reading Comprehension (https://arxiv.org/abs/1607.04423) in TensorFlow ☆177 · Updated 8 years ago
- Source code for "Accelerating Neural Transformer via an Average Attention Network" ☆78 · Updated 5 years ago
- Attention-based NMT with a coverage mechanism to indicate whether a source word is translated or not ☆111 · Updated 5 years ago
- An attempt to implement the TreeLSTM in Theano ☆44 · Updated 9 years ago
- ☆149 · Updated 2 years ago
- An implementation of RNNsearch using TensorFlow ☆67 · Updated 7 years ago
- ☆42 · Updated 6 years ago
- Simple TensorFlow implementation of "A Structured Self-attentive Sentence Embedding" (ICLR 2017) ☆91 · Updated 6 years ago
- Training RNNs as fast as CNNs. An unofficial TensorFlow implementation.