monk1337 / Various-Attention-mechanisms
This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
☆126Updated 3 years ago
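As a quick illustration of the kind of mechanism the repository covers, below is a minimal sketch of Bahdanau-style additive attention in PyTorch. This is not code from the repository; the `AdditiveAttention` class, its argument names, and the example dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""
    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.W_q = nn.Linear(query_dim, hidden_dim, bias=False)
        self.W_k = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim), e.g. the current decoder hidden state
        # keys:  (batch, seq_len, key_dim), e.g. all encoder outputs
        scores = self.v(torch.tanh(self.W_q(query).unsqueeze(1) + self.W_k(keys)))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)      # attention distribution over time steps
        context = (weights * keys).sum(dim=1)       # (batch, key_dim): weighted sum of keys
        return context, weights.squeeze(-1)

# Example usage with illustrative sizes
attn = AdditiveAttention(query_dim=128, key_dim=256, hidden_dim=64)
q = torch.randn(4, 128)
k = torch.randn(4, 10, 256)
context, weights = attn(q, k)
print(context.shape, weights.shape)  # torch.Size([4, 256]) torch.Size([4, 10])
```

Soft attention over encoder states follows the same pattern; only the scoring function changes (for example, a dot product in place of the additive `v^T tanh(...)` score).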
Alternatives and similar repositories for Various-Attention-mechanisms
Users interested in Various-Attention-mechanisms are comparing it to the libraries listed below.
- A PyTorch implementation of Fairseq Convolutional Sequence to Sequence Learning (Gehring et al., 2017)☆46Updated 6 years ago
- Document classification using an LSTM with self-attention☆112Updated 5 years ago
- Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need"☆28Updated 6 years ago
- PyTorch implementation of batched bi-RNN encoder and attention-decoder.☆280Updated 6 years ago
- Minimal RNN classifier with self-attention in Pytorch☆150Updated 3 years ago
- Keras implementation of “Gated Linear Unit”☆23Updated last year
- Nested LSTM (NLSTM) in PyTorch☆17Updated 7 years ago
- Sequence-to-sequence and attention from scratch using TensorFlow☆29Updated 7 years ago
- Semi-supervised learning for text classification☆83Updated 6 years ago
- TensorFlow implementation of Densely Connected Bidirectional LSTM with Applications to Sentence Classification☆47Updated 7 years ago
- Sequence to Sequence Models in PyTorch☆44Updated 10 months ago
- Pytorch implementation of R-Transformer. Some parts of the code are adapted from the implementation of TCN and Transformer.☆230Updated 5 years ago
- Keras implementation of ON-LSTM (Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks)☆156Updated 5 years ago
- Beam search for neural network sequence to sequence (encoder-decoder) models.☆34Updated 6 years ago
- Efficient Transformers for research in PyTorch and TensorFlow, using locality-sensitive hashing☆95Updated 5 years ago
- Position embedding layers in Keras☆58Updated 3 years ago
- Multi-head attention for image classification☆80Updated 7 years ago
- Reproducing Character-Level-Language-Modeling with Deeper Self-Attention in PyTorch☆61Updated 6 years ago
- Text classification with an LSTM on the R8 dataset; PyTorch implementation☆141Updated 7 years ago
- An implementation of an encoder-decoder model with a global attention mechanism.☆32Updated 5 years ago
- PyTorch implementation of "Attention Is All You Need"☆238Updated 4 years ago
- Highway Networks implemented in PyTorch☆71Updated 2 years ago
- ☆76Updated 5 years ago
- An LSTM in PyTorch with best practices (weight dropout, forget bias, etc.) built-in. Fully compatible with PyTorch LSTM.☆133Updated 5 years ago
- Neural network attention mechanisms in PyTorch☆147Updated 6 years ago
- Sequence to Sequence Models with PyTorch☆26Updated 7 years ago
- Collection of custom layers and utility functions for Keras which are missing in the main framework.☆62Updated 5 years ago
- TensorFlow implementation of Variational Attention for Sequence to Sequence Models (COLING 2018)☆70Updated 4 years ago
- Conditional Sequence Generative Adversarial Network trained with policy gradient; implementation in TensorFlow☆49Updated 6 years ago
- A quick walk-through of the innards of LSTMs and a naive implementation of the Mogrifier LSTM paper in PyTorch☆77Updated 4 years ago