philipperemy / keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
☆2,813 · Updated last year
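The layer implements the two classical attention scoring functions named above. A minimal NumPy sketch of how a Luong (multiplicative) and a Bahdanau (additive) score turn a decoder query and encoder states into attention weights and a context vector; the shapes, weight matrices, and variable names here are illustrative, not the repo's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: one decoder query vector h_t and a sequence of
# encoder hidden states h_s (these names are assumptions, not the repo's).
units, seq_len = 4, 5
query = rng.standard_normal(units)            # decoder state h_t
keys = rng.standard_normal((seq_len, units))  # encoder states h_s

# Luong (multiplicative / dot-product) score: score(h_t, h_s) = h_t . h_s
luong_scores = keys @ query                   # shape (seq_len,)

# Bahdanau (additive) score: score = v^T tanh(W1 h_t + W2 h_s)
W1 = rng.standard_normal((units, units))
W2 = rng.standard_normal((units, units))
v = rng.standard_normal(units)
bahdanau_scores = np.tanh(query @ W1 + keys @ W2) @ v  # shape (seq_len,)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Either score vector is normalized into attention weights, and the
# context vector is the weighted sum of the encoder states.
weights = softmax(luong_scores)
context = weights @ keys                      # shape (units,)
```

In practice the only difference between the two variants is the scoring function; the softmax normalization and the weighted sum that follow are identical.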
Alternatives and similar repositories for keras-attention
Users interested in keras-attention are comparing it to the libraries listed below.
- Attention mechanism for processing sequential data that considers the context for each timestamp. ☆656 · Updated 3 years ago
- Visualizing RNNs using the attention mechanism. ☆751 · Updated 6 years ago
- A Keras+TensorFlow implementation of the Transformer: Attention Is All You Need. ☆713 · Updated 4 years ago
- Sequence to Sequence Learning with Keras. ☆3,173 · Updated 3 years ago
- Keras Temporal Convolutional Network; supports Python and R. ☆1,982 · Updated 7 months ago
- Keras layer implementation of attention for sequential models. ☆442 · Updated 2 years ago
- Some attention implementations. ☆1,450 · Updated 5 years ago
- Keras community contributions. ☆1,580 · Updated 3 years ago
- Keras library for building (Universal) Transformers, facilitating BERT and GPT models. ☆540 · Updated 5 years ago
- Sequence modeling benchmarks and temporal convolutional networks. ☆4,401 · Updated 3 years ago
- A TensorFlow implementation of the Transformer: Attention Is All You Need. ☆4,431 · Updated 2 years ago
- Keras implementation of BERT with pre-trained weights. ☆815 · Updated 6 years ago
- TensorFlow implementation of an attention mechanism for text classification tasks. ☆747 · Updated 5 years ago
- An open-source framework for seq2seq models in PyTorch. ☆1,516 · Updated last month
- Framework for building complex recurrent neural networks with Keras. ☆767 · Updated 3 years ago
- Implementation of Sequence Generative Adversarial Nets with Policy Gradient. ☆2,094 · Updated 6 years ago
- Transformer implemented in Keras. ☆369 · Updated 3 years ago
- Layer outputs and gradients in Keras, made easy. ☆1,053 · Updated 7 months ago
- Text classifier for Hierarchical Attention Networks for Document Classification. ☆1,080 · Updated 4 years ago
- LSTM and QRNN language model toolkit for PyTorch. ☆1,984 · Updated 3 years ago
- Signal forecasting with a sequence-to-sequence (seq2seq) recurrent neural network (RNN) model in TensorFlow, by Guillaume Chevalier. ☆1,085 · Updated 2 years ago
- Dynamic seq2seq in TensorFlow, step by step. ☆995 · Updated 8 years ago
- A wrapper layer for stacking layers horizontally. ☆228 · Updated 3 years ago
- Implementation of BERT that can load the official pre-trained models for feature extraction and prediction. ☆2,426 · Updated 3 years ago
- Codebase for the paper "LSTM Fully Convolutional Networks for Time Series Classification". ☆800 · Updated 6 years ago
- Attention-based LSTM/Dense implemented in Keras. ☆300 · Updated 7 years ago
- Minimal seq2seq model with attention for neural machine translation in PyTorch. ☆703 · Updated 4 years ago
- ☆3,677 · Updated 3 years ago
- Visualization toolbox for Long Short-Term Memory networks (LSTMs). ☆1,259 · Updated 3 years ago
- Text autoencoder with LSTMs. ☆262 · Updated 6 years ago