ap229997 / LanguageModel-using-Attention
PyTorch implementation of a basic language model using attention in an LSTM network
☆27 · Updated 7 years ago
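For orientation, the sketch below shows roughly what such a model looks like in PyTorch: an LSTM language model that attends over its own past hidden states when predicting the next token. This is an illustrative sketch, not the repository's actual code; the class name, layer sizes, and scoring function are assumptions.

```python
# Minimal sketch (NOT the repository's code): LSTM language model with
# additive attention over the LSTM's past hidden states.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionLSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim * 2, 1)   # scores each past state against the current one
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer ids
        embedded = self.embedding(tokens)            # (batch, seq_len, embed_dim)
        states, _ = self.lstm(embedded)              # (batch, seq_len, hidden_dim)
        seq_len = states.size(1)

        logits = []
        for t in range(seq_len):
            query = states[:, t:t + 1, :]            # current hidden state, (batch, 1, hidden)
            keys = states[:, :t + 1, :]              # all states up to and including step t
            scores = self.attn(torch.cat([query.expand_as(keys), keys], dim=-1))
            weights = F.softmax(scores, dim=1)       # attention weights over positions 0..t
            context = (weights * keys).sum(dim=1)    # (batch, hidden_dim)
            logits.append(self.out(torch.cat([states[:, t, :], context], dim=-1)))
        return torch.stack(logits, dim=1)            # (batch, seq_len, vocab_size)

# Toy usage: next-token logits for a random batch.
model = AttentionLSTMLanguageModel(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 12))
print(model(tokens).shape)   # torch.Size([4, 12, 1000])
```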
Alternatives and similar repositories for LanguageModel-using-Attention
Users who are interested in LanguageModel-using-Attention are comparing it to the libraries listed below
- PyTorch implementation of batched bi-RNN encoder and attention-decoder. ☆281 · Updated 6 years ago
- Word Embedding + LSTM + FC ☆160 · Updated last year
- Implementation of Hierarchical Attention Networks in PyTorch ☆129 · Updated 7 years ago
- Text classification with an LSTM on the R8 dataset, implemented in PyTorch ☆141 · Updated 8 years ago
- LSTM and CNN sentiment analysis ☆171 · Updated 7 years ago
- Minimal RNN classifier with self-attention in Pytorch ☆152 · Updated 4 years ago
- This repository contains various types of attention mechanisms, such as Bahdanau, soft attention, additive attention, hierarchical attention… (a minimal additive-attention sketch follows this list) ☆127 · Updated 4 years ago
- Pytorch implementation of R-Transformer. Some parts of the code are adapted from the implementation of TCN and Transformer. ☆231 · Updated 6 years ago
- A Structured Self-attentive Sentence Embedding ☆494 · Updated 6 years ago
- Pytorch implementation for NIPS2017 paper `Dynamic Routing Between Capsules` ☆159 · Updated 8 years ago
- pytorch neural network attention mechanism ☆148 · Updated 6 years ago
- LSTM Classification using Pytorch ☆77 · Updated 6 years ago
- document classification using LSTM + self attention ☆114 · Updated 6 years ago
- All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention. ☆234 · Updated 5 years ago
- A pytorch implementation of Fairseq Convolutional Sequence to Sequence Learning (Gehring et al. 2017) ☆45 · Updated 7 years ago
- Hierarchical Attention Networks for Document Classification in PyTorch ☆36 · Updated 7 years ago
- pytorch implementation of Independently Recurrent Neural Networks https://arxiv.org/abs/1803.04831 ☆121 · Updated 7 years ago
- Repository for Attention Algorithm ☆41 · Updated 7 years ago
- pytorch implementation of Attention is all you need ☆239 · Updated 4 years ago
- Text classification models: cnn, self-attention, cnn-rnf, rnn-att, capsule-net. TensorFlow. Single GPU or multi GPU ☆19 · Updated 5 years ago
- A recurrent attention module consisting of an LSTM cell which can query its own past cell states by the means of windowed multi-head atte… ☆146 · Updated 7 years ago
- DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding ☆26 · Updated 7 years ago
- extract features by maximizing mutual information ☆148 · Updated 6 years ago
- ECML 2019: Graph Neural Networks for Multi-Label Classification ☆91 · Updated last year
- NLSTM Nested LSTM in Pytorch ☆17 · Updated 7 years ago
- Multi heads attention for image classification ☆80 · Updated 7 years ago
- Hierarchical Attention Networks for Document Classification in PyTorch ☆608 · Updated 5 years ago
- A PyTorch implementation of: Language Modeling with Gated Convolutional Networks. ☆103 · Updated 4 years ago
- Independently Recurrent Neural Networks (IndRNN) implemented in pytorch. ☆138 · Updated 5 years ago
- A pytorch implementation of the paper: "Skip RNN: Learning to Skip State Updates in Recurrent Neural Networks" ☆83 · Updated 7 years ago
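Several of the repositories above implement additive (Bahdanau-style) attention. The sketch below shows that scoring function in isolation, under the usual formulation score(q, k) = vᵀ tanh(W_q q + W_k k); it is an illustrative example, not code from any listed repository, and the class name, dimensions, and layer names are assumptions.

```python
# Minimal sketch (not from any repository above): additive / Bahdanau-style attention.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, query_dim, key_dim, attn_dim):
        super().__init__()
        self.w_query = nn.Linear(query_dim, attn_dim, bias=False)
        self.w_key = nn.Linear(key_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim), keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.w_query(query).unsqueeze(1) + self.w_key(keys)))
        weights = torch.softmax(scores, dim=1)       # (batch, seq_len, 1)
        context = (weights * keys).sum(dim=1)        # weighted sum of the keys, (batch, key_dim)
        return context, weights.squeeze(-1)

# Toy usage: attend over 10 encoder states with a single decoder query.
attn = AdditiveAttention(query_dim=256, key_dim=512, attn_dim=128)
context, weights = attn(torch.randn(4, 256), torch.randn(4, 10, 512))
print(context.shape, weights.shape)  # torch.Size([4, 512]) torch.Size([4, 10])
```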