CyberZHG / keras-self-attention
Attention mechanism for processing sequential data that considers the context for each timestamp.
☆659 · Updated 3 years ago
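The mechanism described above can be sketched in plain NumPy: each timestep's output becomes a weighted sum of all timesteps, with the weights produced by a softmax over pairwise similarity scores. This is only an illustrative sketch of scaled dot-product self-attention, with a hypothetical `self_attention` helper; it is not the keras-self-attention API, which also offers additive (Bahdanau-style) scoring.

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention over one sequence.

    x: array of shape (timesteps, features). Each output row is a
    context-aware weighted average over every timestep, so each
    position "sees" the whole sequence.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                   # (T, T) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax per row
    return weights @ x                              # (T, features)

seq = np.random.rand(5, 8)   # 5 timesteps, 8 features
context = self_attention(seq)
print(context.shape)         # (5, 8)
```

Because each output row is a convex combination of the input rows, every output value stays within the per-feature range of the input sequence.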
Alternatives and similar repositories for keras-self-attention
Users interested in keras-self-attention are comparing it to the libraries listed below.
- A wrapper layer for stacking layers horizontally · ☆228 · Updated 3 years ago
- A Keras + TensorFlow implementation of the Transformer from "Attention Is All You Need" · ☆712 · Updated 4 years ago
- Keras layer implementation of attention for sequential models · ☆443 · Updated 2 years ago
- Keras library for building (Universal) Transformers, facilitating BERT and GPT models · ☆541 · Updated 5 years ago
- Transformer implemented in Keras · ☆369 · Updated 3 years ago
- Visualizing RNNs using the attention mechanism · ☆751 · Updated 6 years ago
- Attention-based LSTM/Dense layers implemented in Keras · ☆300 · Updated 7 years ago
- Keras attention layer (Luong and Bahdanau scores) · ☆2,814 · Updated 2 years ago
- Implementations of a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with Ten… · ☆362 · Updated last year
- An example attention network with a simple dataset · ☆228 · Updated 6 years ago
- Attention-based bidirectional LSTM for classification tasks (ICASSP) · ☆116 · Updated 3 years ago
- Keras implementation of BERT with pre-trained weights · ☆816 · Updated 6 years ago
- Self-attention and text classification · ☆119 · Updated 7 years ago
- Implementations of papers for the text classification task on DBpedia · ☆737 · Updated 5 years ago
- A bidirectional LSTM with attention for multiclass/multilabel text classification · ☆173 · Updated last year
- Hierarchical Attention Networks for Document Classification in PyTorch · ☆608 · Updated 5 years ago
- Text autoencoder with LSTMs · ☆262 · Updated 6 years ago
- An attention layer in Keras · ☆43 · Updated 6 years ago
- Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs · ☆369 · Updated 5 years ago
- Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need" · ☆28 · Updated 6 years ago
- Text classification using different neural networks (CNN, LSTM, Bi-LSTM, C-LSTM) · ☆210 · Updated 7 years ago
- Document classification using LSTM + self-attention · ☆114 · Updated 6 years ago
- A simple technique to integrate BERT from TF Hub into Keras · ☆258 · Updated 2 years ago
- Text classifier for Hierarchical Attention Networks for Document Classification · ☆1,081 · Updated 4 years ago
- Collection of custom layers and utility functions for Keras that are missing from the main framework · ☆62 · Updated 5 years ago
- An implementation of a sequence-to-sequence neural network using an encoder-decoder · ☆210 · Updated 6 years ago
- Minimal Seq2Seq model with attention for neural machine translation in PyTorch · ☆703 · Updated 5 years ago
- TensorFlow implementation of an attention mechanism for text classification tasks · ☆747 · Updated 6 years ago
- RMDL: Random Multimodel Deep Learning for Classification · ☆433 · Updated 2 years ago
- Re-implementation of ELMo in Keras · ☆135 · Updated 2 years ago