CyberZHG / keras-self-attention
Attention mechanism for processing sequential data that considers the context for each timestamp.
☆655 · Updated 3 years ago
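The repository's description, "attention that considers the context for each timestamp", is the core self-attention idea: each position in a sequence is re-expressed as a weighted mix of all positions. As a rough illustration only (not this library's API; all names below are my own), scaled dot-product self-attention can be sketched in NumPy:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over one sequence.

    x: (seq_len, d_model) inputs; w_q/w_k/w_v: (d_model, d_k) projections.
    The output at timestep t is a context-weighted mix of the whole
    sequence, with weights given by query-key similarity.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v, weights
```

In a real Keras layer the projection matrices would be trainable weights created in `build()`; here they are passed in explicitly to keep the sketch self-contained.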
Alternatives and similar repositories for keras-self-attention
Users interested in keras-self-attention are comparing it to the libraries listed below.
- A wrapper layer for stacking layers horizontally (☆228, updated 3 years ago)
- Transformer implemented in Keras (☆371, updated 3 years ago)
- Keras layer implementation of attention for sequential models (☆441, updated 2 years ago)
- A Keras+TensorFlow implementation of the Transformer: Attention Is All You Need (☆712, updated 3 years ago)
- Keras library for building (Universal) Transformers, facilitating BERT and GPT models (☆537, updated 5 years ago)
- Visualizing RNNs using the attention mechanism (☆750, updated 5 years ago)
- Keras implementation of BERT with pre-trained weights (☆814, updated 5 years ago)
- An example attention network with a simple dataset (☆230, updated 6 years ago)
- Keras attention layer (Luong and Bahdanau scores) (☆2,809, updated last year)
- Attention-based LSTM/Dense models implemented in Keras (☆299, updated 7 years ago)
- Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with Ten… (☆359, updated last year)
- A simple technique to integrate BERT from TF Hub into Keras (☆258, updated 2 years ago)
- Named-Entity-Recognition-with-Bidirectional-LSTM-CNNs (☆365, updated 5 years ago)
- Binary and categorical focal loss implementations in Keras (☆279, updated 5 months ago)
- Hierarchical Attention Networks for document classification in PyTorch (☆604, updated 5 years ago)
- Implementation of papers for the text classification task on DBpedia (☆737, updated 4 years ago)
- Re-implementation of ELMo in Keras (☆133, updated 2 years ago)
- Layer normalization implemented in Keras (☆60, updated 3 years ago)
- How to use ELMo embeddings in Keras with TensorFlow Hub (☆259, updated 6 years ago)
- Self-attention for text classification (☆119, updated 6 years ago)
- RAdam implemented in Keras & TensorFlow (☆325, updated 3 years ago)
- Minimal seq2seq model with attention for neural machine translation in PyTorch (☆699, updated 4 years ago)
- Assorted attention implementations (☆1,445, updated 5 years ago)
- TensorFlow implementation of attention mechanisms for text classification tasks (☆748, updated 5 years ago)
- This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention… (☆126, updated 3 years ago)
- Document classification with Hierarchical Attention Networks in TensorFlow. WARNING: project is currently unmaintained, issues will proba… (☆465, updated last year)
- Framework for building complex recurrent neural networks with Keras (☆765, updated 2 years ago)
- BiLSTM-CNN-CRF architecture for sequence tagging (☆830, updated 4 years ago)
- An implementation of a sequence-to-sequence neural network using an encoder-decoder (☆209, updated 5 years ago)
- Focal loss implementation in Keras (☆326, updated 4 years ago)
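Two of the repositories above provide Keras focal-loss layers. As a hedged sketch of what such a loss computes (the function name and defaults here are my own, not those libraries' APIs), the standard binary focal loss from Lin et al. can be written in NumPy:

```python
import numpy as np

def binary_focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss: cross-entropy scaled by (1 - p_t)^gamma,
    which down-weights easy, well-classified examples so training
    focuses on hard ones.

    y_true: labels in {0, 1}; y_pred: predicted probability of class 1.
    """
    p = np.clip(y_pred, eps, 1 - eps)            # avoid log(0)
    p_t = np.where(y_true == 1, p, 1 - p)        # prob of the true class
    alpha_t = np.where(y_true == 1, alpha, 1 - alpha)
    return np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t))
```

With `gamma=0` and `alpha=1` this reduces to plain binary cross-entropy; larger `gamma` suppresses the contribution of confident correct predictions.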