monk1337 / Various-Attention-mechanisms
This repository contains various attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
☆127 · Updated 4 years ago
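As a quick orientation for the kind of mechanism the repository covers, below is a minimal sketch of additive (Bahdanau-style) attention in PyTorch. It is illustrative only and is not taken from this repository; the class name, layer sizes, and tensor shapes are assumptions chosen for the example.

```python
# Minimal sketch of additive (Bahdanau-style) attention, for illustration only.
# Not the repository's implementation; names and dimensions are hypothetical.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.W_q = nn.Linear(query_dim, hidden_dim, bias=False)   # projects the decoder state
        self.W_k = nn.Linear(key_dim, hidden_dim, bias=False)     # projects each encoder state
        self.v = nn.Linear(hidden_dim, 1, bias=False)             # scores the combined projection

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.W_q(query).unsqueeze(1) + self.W_k(keys)))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)          # attention distribution over time steps
        context = (weights * keys).sum(dim=1)           # weighted sum of encoder states
        return context, weights.squeeze(-1)

# Usage with made-up shapes:
attn = AdditiveAttention(query_dim=128, key_dim=256, hidden_dim=64)
decoder_state = torch.randn(4, 128)         # (batch, query_dim)
encoder_outputs = torch.randn(4, 10, 256)   # (batch, seq_len, key_dim)
context, weights = attn(decoder_state, encoder_outputs)
print(context.shape, weights.shape)         # torch.Size([4, 256]) torch.Size([4, 10])
```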
Alternatives and similar repositories for Various-Attention-mechanisms
Users interested in Various-Attention-mechanisms are comparing it to the libraries listed below.
- pytorch neural network attention mechanism ☆148 · Updated 6 years ago
- document classification using LSTM + self attention ☆114 · Updated 6 years ago
- Tensorflow Implementation of Densely Connected Bidirectional LSTM with Applications to Sentence Classification ☆47 · Updated 7 years ago
- A pytorch implementation of Fairseq Convolutional Sequence to Sequence Learning (Gehring et al. 2017) ☆45 · Updated 7 years ago
- Text classification based on LSTM on R8 dataset for pytorch implementation ☆141 · Updated 8 years ago
- NLSTM Nested LSTM in Pytorch ☆17 · Updated 7 years ago
- Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with Ten… ☆362 · Updated last year
- Keras implementation of “Gated Linear Unit” ☆23 · Updated last year
- The implementation of Meta-LSTM in "Meta Multi-Task Learning for Sequence Modeling." AAAI-18 ☆33 · Updated 7 years ago
- PyTorch implementation of batched bi-RNN encoder and attention-decoder. ☆281 · Updated 7 years ago
- Word Embedding + LSTM + FC ☆160 · Updated last year
- TensorFlow Implementation of TCN (Temporal Convolutional Networks) ☆111 · Updated 7 years ago
- Minimal RNN classifier with self-attention in Pytorch ☆152 · Updated 4 years ago
- Sequence to Sequence and attention from scratch using Tensorflow ☆29 · Updated 8 years ago
- Semi Supervised Learning for Text-Classification ☆83 · Updated 6 years ago
- Position embedding layers in Keras ☆58 · Updated 3 years ago
- Pytorch implementation of R-Transformer. Some parts of the code are adapted from the implementation of TCN and Transformer. ☆231 · Updated 6 years ago
- Tensorflow Implementation of Variational Attention for Sequence to Sequence Models (COLING 2018) ☆72 · Updated 5 years ago
- Reproducing Character-Level-Language-Modeling with Deeper Self-Attention in PyTorch ☆62 · Updated 7 years ago
- A bidirectional LSTM with attention for multiclass/multilabel text classification. ☆173 · Updated last year
- Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need" ☆28 · Updated 6 years ago
- Hierarchical Attention Networks for Document Classification in PyTorch ☆36 · Updated 7 years ago
- Tensorflow implementation of Semi-supervised Sequence Learning (https://arxiv.org/abs/1511.01432) ☆82 · Updated 3 years ago
- An example attention network with simple dataset. ☆228 · Updated 6 years ago
- ☆38 · Updated 8 years ago
- Multi-Task Learning in NLP ☆94 · Updated 8 years ago
- LSTM and CNN sentiment analysis ☆172 · Updated 7 years ago
- A recurrent attention module consisting of an LSTM cell which can query its own past cell states by the means of windowed multi-head atte… ☆146 · Updated 7 years ago
- Conditional Sequence Generative Adversarial Network trained with policy gradient, Implementation in Tensorflow ☆49 · Updated 7 years ago
- Implement an en-fr translation task by implementing seq2seq, encoder-decoder in RNN layers with Attention mechanism and Beamsearch inference d… ☆21 · Updated 7 years ago