lzfelix / keras_attention
An Attention Layer in Keras
☆43 · Updated 6 years ago
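The layer this repository provides computes a weighted sum of RNN hidden states, where the weights are learned attention scores. As a rough illustration of that idea only, here is a minimal NumPy sketch of additive attention pooling; the function and parameter names (`attention_pool`, `W`, `b`, `u`) are hypothetical and not taken from the repository's code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(h, W, b, u):
    """Additive attention pooling over a sequence of hidden states.

    h: (timesteps, hidden) RNN outputs
    W: (hidden, hidden), b: (hidden,), u: (hidden,) learned parameters
    Returns a (hidden,) attention-weighted sum over the timesteps.
    """
    scores = np.tanh(h @ W + b) @ u   # one score per timestep: (timesteps,)
    weights = softmax(scores)         # attention distribution, sums to 1
    return weights @ h                # weighted sum of hidden states: (hidden,)

# Usage with random data in place of real RNN outputs and trained weights:
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))
context = attention_pool(h, rng.normal(size=(8, 8)),
                         rng.normal(size=8), rng.normal(size=8))
```

In a real Keras layer, `W`, `b`, and `u` would be trainable weights created in `build()`, and the same computation would run batched over tensors.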
Alternatives and similar repositories for keras_attention
Users interested in keras_attention are comparing it to the libraries listed below.
- Re-implementation of ELMo on Keras ☆133 · Updated 2 years ago
- Position embedding layers in Keras ☆58 · Updated 3 years ago
- How to use ELMo embeddings in Keras with TensorFlow Hub ☆259 · Updated 6 years ago
- Concatenate word and character embeddings in Keras ☆44 · Updated 3 years ago
- A short tutorial on ELMo training (pre-trained, training on new data, incremental training) ☆153 · Updated 5 years ago
- TensorFlow implementation of Semi-supervised Sequence Learning (https://arxiv.org/abs/1511.01432) ☆81 · Updated 2 years ago
- Text classification using character-level convolutional neural networks in Keras ☆150 · Updated 2 years ago
- Implementation of Siamese neural networks built upon a multi-head attention mechanism for the text semantic-similarity task ☆182 · Updated 2 years ago
- A bidirectional LSTM with attention for multiclass/multilabel text classification ☆173 · Updated 11 months ago
- TensorFlow implementation of the paper "Hierarchical Attention Networks for Document Classification" ☆87 · Updated 6 years ago
- Keras attentional bi-LSTM-CRF for joint NLU (slot filling and intent detection) with ATIS ☆123 · Updated 6 years ago
- Transformer-XL with checkpoint loader ☆68 · Updated 3 years ago
- Simple TensorFlow implementation of "A Structured Self-attentive Sentence Embedding" (ICLR 2017) ☆91 · Updated 7 years ago
- Using pre-trained word embeddings (fastText, Word2Vec) ☆157 · Updated 7 years ago
- Train and visualize Hierarchical Attention Networks ☆202 · Updated 7 years ago
- CapsNet for NLP ☆67 · Updated 6 years ago
- A drop-in Keras layer for ELMo embeddings ☆47 · Updated 6 years ago
- Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need" ☆28 · Updated 6 years ago
- Implementation of Hierarchical Attention Networks as presented in https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf