monk1337 / Various-Attention-mechanisms
This repository contains various attention mechanisms (Bahdanau attention, soft attention, additive attention, hierarchical attention, etc.) implemented in PyTorch, TensorFlow, and Keras.
☆123 · Updated 2 years ago
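For readers new to these mechanisms, here is a minimal sketch of Bahdanau-style additive attention in PyTorch. The class name, dimensions, and exact parameterization are illustrative assumptions, not code taken from this repository; the scoring form follows the standard v^T tanh(W_q q + W_k k) formulation.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Minimal Bahdanau-style additive attention (illustrative sketch).

    Scores each key against the query with: score = v^T tanh(W_q q + W_k k).
    """
    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.W_q = nn.Linear(query_dim, hidden_dim, bias=False)
        self.W_k = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        # Broadcast the projected query over the sequence dimension.
        scores = self.v(torch.tanh(self.W_q(query).unsqueeze(1) + self.W_k(keys)))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)         # (batch, seq_len)
        # Weighted sum of the keys gives the context vector.
        context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)  # (batch, key_dim)
        return context, weights

attn = AdditiveAttention(query_dim=16, key_dim=16, hidden_dim=32)
q = torch.randn(2, 16)       # e.g. decoder hidden state
k = torch.randn(2, 5, 16)    # e.g. encoder outputs for a length-5 sequence
context, weights = attn(q, k)
print(context.shape, weights.shape)  # torch.Size([2, 16]) torch.Size([2, 5])
```

The attention weights sum to 1 over the sequence dimension, so `context` is a convex combination of the key vectors.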
Related projects:
- A PyTorch implementation of Fairseq Convolutional Sequence to Sequence Learning (Gehring et al., 2017) ☆43 · Updated 5 years ago
- Document classification using LSTM + self-attention ☆112 · Updated 4 years ago
- PyTorch implementation of text classification with LSTM on the R8 dataset ☆140 · Updated 7 years ago
- Minimal RNN classifier with self-attention in PyTorch ☆148 · Updated 2 years ago
- PyTorch implementation of a batched bi-RNN encoder and attention decoder ☆278 · Updated 5 years ago
- Multi-head attention for image classification ☆80 · Updated 6 years ago
- Hierarchical Attention Networks for Document Classification in PyTorch ☆36 · Updated 5 years ago
- PyTorch neural network attention mechanisms ☆147 · Updated 5 years ago
- Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need" ☆28 · Updated 5 years ago
- TensorFlow implementation of Densely Connected Bidirectional LSTM with applications to sentence classification ☆48 · Updated 6 years ago
- Self-attention and text classification ☆119 · Updated 5 years ago
- Keras implement of ON-LSTM (Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks)