lilianweng / transformer-tensorflow
Implementation of Transformer Model in Tensorflow
☆466 · Updated last year
Alternatives and similar repositories for transformer-tensorflow:
Users interested in transformer-tensorflow are comparing it to the libraries listed below.
- A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need ☆713 · Updated 3 years ago
- Deep Reinforcement Learning For Sequence to Sequence Models ☆766 · Updated last year
- Transformer implemented in Keras ☆372 · Updated 3 years ago
- TensorFlow implementation of 'Attention Is All You Need (2017. 6)' ☆348 · Updated 6 years ago
- Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch ☆695 · Updated 4 years ago
- Sequence-to-Sequence learning using PyTorch ☆522 · Updated 5 years ago
- Keras library for building (Universal) Transformers, facilitating BERT and GPT models ☆536 · Updated 4 years ago
- Transformer training code for sequential tasks ☆610 · Updated 3 years ago
- Visualizing RNNs using the attention mechanism ☆749 · Updated 5 years ago
- Sequence to Sequence Models with PyTorch ☆738 · Updated 2 years ago
- An LSTM in PyTorch with best practices (weight dropout, forget bias, etc.) built-in. Fully compatible with PyTorch LSTM. ☆132 · Updated 5 years ago
- Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with Ten… ☆353 · Updated last year
- Text autoencoder with LSTMs ☆262 · Updated 5 years ago
- Implementation of Universal Transformer in Pytorch ☆259 · Updated 6 years ago
- Simple transformer implementation from scratch in pytorch. ☆1,076 · Updated 9 months ago
- Single Headed Attention RNN - "Stop thinking with your head" ☆1,180 · Updated 3 years ago
- Sequence-to-sequence model with LSTM encoder/decoders and attention ☆1,269 · Updated 4 years ago
- A TensorFlow Implementation of the Transformer: Attention Is All You Need ☆4,328 · Updated last year
- Code for the paper "Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks" ☆577 · Updated 5 years ago
- Pytorch implementation of R-Transformer. Some parts of the code are adapted from the implementation of TCN and Transformer. ☆227 · Updated 5 years ago
- A repository containing tutorials for practical NLP using PyTorch ☆533 · Updated 5 years ago
- Transformer seq2seq model, program that can build a language translator from parallel corpus ☆1,373 · Updated last year
- LSTM and QRNN Language Model Toolkit for PyTorch ☆1,965 · Updated 3 years ago
- Reformer, the efficient Transformer, in Pytorch ☆2,152 · Updated last year
- Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers" ☆1,550 · Updated 4 years ago
- PyTorch implementation of batched bi-RNN encoder and attention-decoder. ☆279 · Updated 6 years ago
- An open source framework for seq2seq models in PyTorch. ☆1,506 · Updated 2 years ago
- Minimal tutorial on packing and unpacking sequences in pytorch ☆210 · Updated 6 years ago
- Attention-based sequence to sequence learning ☆390 · Updated 5 years ago
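
The common building block behind the repositories above is scaled dot-product attention from "Attention Is All You Need". The sketch below, assuming TensorFlow 2.x (the framework of the root repository), illustrates that operation; the function name and toy shapes are illustrative only and are not taken from any of the listed projects.

```python
# A minimal sketch of scaled dot-product attention, assuming TensorFlow 2.x.
# Illustrative only; not the implementation from any repository listed above.
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: [batch, heads, seq_len, depth]; mask broadcasts to the score shape."""
    scores = tf.matmul(q, k, transpose_b=True)                      # [batch, heads, seq_q, seq_k]
    scores /= tf.math.sqrt(tf.cast(tf.shape(k)[-1], scores.dtype))  # scale by sqrt(depth)
    if mask is not None:
        scores += mask * -1e9                                       # suppress masked positions
    weights = tf.nn.softmax(scores, axis=-1)                        # attention distribution over keys
    return tf.matmul(weights, v), weights                           # weighted sum of values

# Toy usage: single head, length-4 sequence, depth 8 (self-attention, so q = k = v).
q = tf.random.normal([1, 1, 4, 8])
out, attn = scaled_dot_product_attention(q, q, q)
print(out.shape, attn.shape)  # (1, 1, 4, 8) (1, 1, 4, 4)
```

Dividing the logits by the square root of the key depth keeps the softmax from saturating as the dimension grows, which is the design choice the original paper motivates and which all of the implementations above follow in some form.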