b-etienne / Seq2seq-PyTorch
☆76 · Updated 5 years ago
Alternatives and similar repositories for Seq2seq-PyTorch:
Users interested in Seq2seq-PyTorch are comparing it to the repositories listed below.
- Using PyTorch's nn.Transformer module to create an English-to-French neural machine translation model. ☆77 · Updated 4 years ago
- An LSTM in PyTorch with best practices (weight dropout, forget bias, etc.) built in. Fully compatible with PyTorch's LSTM. ☆132 · Updated 5 years ago
- Sequence-to-sequence models in PyTorch. ☆44 · Updated 7 months ago
- PyTorch implementation of R-Transformer. Some parts of the code are adapted from implementations of TCN and Transformer. ☆228 · Updated 5 years ago
- The Annotated Encoder-Decoder with Attention. ☆166 · Updated 4 years ago
- PyTorch DataLoader for seq2seq. ☆84 · Updated 6 years ago
- PyTorch implementation of a seq2seq model with attention and greedy search / beam search for neural machine translation. ☆58 · Updated 3 years ago
- A PyTorch implementation of the Transformer from "Attention Is All You Need". ☆104 · Updated 4 years ago
- Minimal RNN classifier with self-attention in PyTorch. ☆150 · Updated 3 years ago
- PyTorch implementation of BERT from "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". ☆103 · Updated 6 years ago
- Fine-tune transformers with pytorch-lightning. ☆44 · Updated 3 years ago
- Encoding position with the word embeddings. ☆82 · Updated 6 years ago
- PyTorch implementation of beam search decoding for seq2seq models. ☆337 · Updated 2 years ago
- Minimal tutorial on packing and unpacking sequences in PyTorch. ☆210 · Updated 6 years ago
- This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention… ☆125 · Updated 3 years ago
- ☆216 · Updated 4 years ago
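Several entries above center on beam search decoding for seq2seq models. As a library-agnostic sketch of the idea (not code from any listed repository), assuming a hypothetical `step_fn` that returns `(token, probability)` continuations for a prefix, the core loop keeps only the top-k partial hypotheses at each step:

```python
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=10):
    """Generic beam search. step_fn(prefix) -> list of (token, prob)
    continuations for the given token prefix."""
    # Each hypothesis is a (log_prob, token_list) pair.
    beams = [(0.0, [start_token])]
    for _ in range(max_len):
        candidates = []
        for logp, seq in beams:
            if seq[-1] == end_token:
                candidates.append((logp, seq))  # keep finished hypotheses as-is
                continue
            for tok, p in step_fn(seq):
                candidates.append((logp + math.log(p), seq + [tok]))
        # Prune to the k highest-scoring hypotheses.
        beams = sorted(candidates, reverse=True)[:beam_width]
        if all(seq[-1] == end_token for _, seq in beams):
            break
    # May include unfinished hypotheses if max_len was reached.
    return max(beams)

# Toy next-token distribution (a made-up bigram table for illustration):
table = {
    "<s>": [("a", 0.6), ("b", 0.4)],
    "a":   [("b", 0.7), ("</s>", 0.3)],
    "b":   [("</s>", 1.0)],
}
score, seq = beam_search(lambda s: table[s[-1]], "<s>", "</s>", beam_width=2)
# seq is ["<s>", "a", "b", "</s>"] with score log(0.6 * 0.7 * 1.0)
```

With `beam_width=1` this degrades to greedy search; a real decoder would replace `step_fn` with a forward pass of the model and typically add length normalization to the scores.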