ap229997 / LanguageModel-using-Attention
PyTorch implementation of a basic language model using attention in an LSTM network
☆27 · Updated 7 years ago
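The repository's description — a language model that applies attention over LSTM states — can be illustrated with a minimal sketch. This is a hypothetical example, not the repository's actual code: the class name `AttentionLM`, its dimensions, and the simple additive scoring layer are all illustrative assumptions, assuming standard PyTorch.

```python
# Hypothetical sketch of an attention-augmented LSTM language model in PyTorch.
# Not the repository's actual architecture; names and dimensions are illustrative.
import torch
import torch.nn as nn

class AttentionLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)          # scores each time step
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))          # (B, T, H) hidden states
        weights = torch.softmax(self.attn(h), dim=1)  # (B, T, 1) attention weights
        context = (weights * h).sum(dim=1)            # (B, H) weighted context
        # combine the attention context with the final hidden state
        combined = torch.cat([context, h[:, -1]], dim=-1)
        return self.out(combined)                     # (B, vocab_size) logits

model = AttentionLM(vocab_size=1000)
logits = model(torch.randint(0, 1000, (4, 12)))  # batch of 4, length-12 sequences
print(logits.shape)  # torch.Size([4, 1000])
```

The attention weights here are a softmax over all time steps, so the context vector is a convex combination of the LSTM's hidden states rather than just its last state.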
Alternatives and similar repositories for LanguageModel-using-Attention
Users interested in LanguageModel-using-Attention are comparing it to the repositories listed below.
- PyTorch implementation of a batched bi-RNN encoder and attention decoder.☆281 · Updated 6 years ago
- This repository contains various types of attention mechanism, such as Bahdanau attention, soft attention, additive attention, hierarchical attention…☆127 · Updated 4 years ago
- Word Embedding + LSTM + FC☆160 · Updated last year
- PyTorch implementation of Independently Recurrent Neural Networks (https://arxiv.org/abs/1803.04831)☆121 · Updated 6 years ago
- Implementation of Hierarchical Attention Networks in PyTorch☆129 · Updated 7 years ago
- Minimal RNN classifier with self-attention in PyTorch☆152 · Updated 3 years ago
- A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention☆146 · Updated 7 years ago
- PyTorch neural network attention mechanisms☆148 · Updated 6 years ago
- A Structured Self-attentive Sentence Embedding☆494 · Updated 6 years ago
- PyTorch implementation of LSTM-based text classification on the R8 dataset☆141 · Updated 8 years ago
- PyTorch implementation of R-Transformer. Some parts of the code are adapted from implementations of TCN and the Transformer.☆230 · Updated 6 years ago
- A PyTorch implementation of the paper "Skip RNN: Learning to Skip State Updates in Recurrent Neural Networks"☆83 · Updated 7 years ago
- Repository for attention algorithms☆41 · Updated 7 years ago
- Document classification using LSTM + self-attention☆114 · Updated 6 years ago
- Independently Recurrent Neural Networks (IndRNN) implemented in PyTorch☆138 · Updated 4 years ago
- Hierarchical Attention Networks for Document Classification in PyTorch☆36 · Updated 7 years ago
- All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention.☆234 · Updated 5 years ago
- A PyTorch implementation of Fairseq's Convolutional Sequence to Sequence Learning (Gehring et al., 2017)☆45 · Updated 6 years ago
- Sequence-to-sequence and attention from scratch using TensorFlow☆29 · Updated 8 years ago
- NLSTM: Nested LSTM in PyTorch☆17 · Updated 7 years ago
- PyTorch implementations of LSTM variants (dropout + layer norm)☆137 · Updated 4 years ago
- Multi-head attention for image classification☆80 · Updated 7 years ago
- Text classification models: CNN, self-attention, CNN-RNF, RNN-att, capsule-net. TensorFlow; single GPU or multi-GPU☆19 · Updated 5 years ago
- LSTM and CNN sentiment analysis☆171 · Updated 7 years ago
- LSTM classification using PyTorch☆77 · Updated 6 years ago
- Implementation of IndRNN in Keras☆67 · Updated 5 years ago
- PyTorch implementation of "Attention Is All You Need"☆239 · Updated 4 years ago
- LSTM and GRU in PyTorch☆269 · Updated 6 years ago
- An implementation of DeepMind's Relational Recurrent Neural Networks (NeurIPS 2018) in PyTorch.☆248 · Updated 6 years ago
- ☆38 · Updated 8 years ago