guillaume-chevalier / Linear-Attention-Recurrent-Neural-Network
A recurrent attention module (LARNN) consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. Like any other RNN cell, the LARNN cell can be used inside a loop over the cell state; a sketch of that usage pattern follows below.
☆145 · Updated 6 years ago
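To make the "loop over the cell state" idea concrete, here is a minimal, hypothetical PyTorch sketch: an `nn.LSTMCell` whose input is augmented with multi-head attention over a window of its own recent cell states. All names (`LARNNCellSketch`, `window_size`, `query_proj`) are illustrative assumptions, not the repository's actual API, and the sketch omits the batch normalization that the real BN-LSTM-derived formulas use.

```python
import torch
import torch.nn as nn

class LARNNCellSketch(nn.Module):
    """Hypothetical LARNN-style cell: an LSTMCell whose input is augmented
    with multi-head attention over a window of its own past cell states.
    Illustrative only; the actual repo also applies batch norm (BN-LSTM)."""

    def __init__(self, input_size, hidden_size, window_size=8, num_heads=4):
        super().__init__()
        self.window_size = window_size
        # The query is built from the current input and previous hidden state.
        self.query_proj = nn.Linear(input_size + hidden_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        # The attended context is concatenated to x_t before the LSTM update.
        self.cell = nn.LSTMCell(input_size + hidden_size, hidden_size)

    def forward(self, x_t, state, past_cells):
        h, c = state  # each of shape (batch, hidden_size)
        if past_cells:
            # Stack the last `window_size` cell states: (batch, window, hidden).
            window = torch.stack(past_cells[-self.window_size:], dim=1)
            query = self.query_proj(torch.cat([x_t, h], dim=-1)).unsqueeze(1)
            context, _ = self.attn(query, window, window)  # (batch, 1, hidden)
            context = context.squeeze(1)
        else:
            context = torch.zeros_like(h)  # no history at the first time step
        h, c = self.cell(torch.cat([x_t, context], dim=-1), (h, c))
        past_cells.append(c)
        return h, (h, c), past_cells
```

Driving it then looks like any other RNN cell: initialize `(h, c)` and an empty window, and step through the sequence in a plain Python loop.

```python
cell = LARNNCellSketch(input_size=32, hidden_size=64)
h, c, past = torch.zeros(4, 64), torch.zeros(4, 64), []
for x_t in torch.randn(10, 4, 32):  # time-major dummy input: (batch, features) per step
    out, (h, c), past = cell(x_t, (h, c), past)
```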
Alternatives and similar repositories for Linear-Attention-Recurrent-Neural-Network
Users interested in Linear-Attention-Recurrent-Neural-Network are comparing it to the repositories listed below.
- Dilated RNNs in PyTorch ☆212 · Updated 5 years ago
- TensorFlow implementation of a Hierarchical Multiscale RNN, described in https://arxiv.org/abs/1609.01704 ☆135 · Updated 7 years ago
- Keras implementation of Phased LSTM [https://arxiv.org/abs/1610.09513] ☆144 · Updated 5 years ago
- PyTorch implementation of Independently Recurrent Neural Networks, https://arxiv.org/abs/1803.04831 ☆121 · Updated 6 years ago
- Keras implementation of Nested LSTMs ☆88 · Updated 6 years ago
- Keras implementation of Attention Augmented Convolutional Neural Networks ☆121 · Updated 5 years ago
- Implementation of IndRNN in Keras ☆67 · Updated 4 years ago
- Phased LSTM: Accelerating Recurrent Network Training for Long or Event-based Sequences (NIPS 2016), TensorFlow 1.0 ☆128 · Updated 6 years ago
- Multiplicative LSTM for Keras 2.0+ ☆42 · Updated 7 years ago
- TensorFlow implementation of DilatedRNN ☆349 · Updated 7 years ago
- [ICLR'19] Trellis Networks for Sequence Modeling ☆472 · Updated 5 years ago
- STCN: Stochastic Temporal Convolutional Networks ☆69 · Updated 4 years ago
- Repository for the ablation study of "Long Short-Term Memory Fully Convolutional Networks for Time Series Classification" ☆54 · Updated 6 years ago
- Codebase for the paper "LSTM Fully Convolutional Networks for Time Series Classification" ☆141 · Updated 6 years ago
- This repository contains various types of attention mechanisms, such as Bahdanau, soft attention, additive attention, hierarchical attention… ☆125 · Updated 3 years ago
- TensorFlow LSTM-autoencoder implementation ☆191 · Updated 7 years ago
- ☆11 · Updated 7 years ago
- Keras implementation of LSTM Variational Autoencoder ☆227 · Updated 5 years ago
- PyTorch implementations of LSTM Variants (Dropout + Layer Norm) ☆136 · Updated 4 years ago
- ☆24 · Updated 4 years ago
- ConvLSTMCell for TensorFlow ☆59 · Updated 7 years ago
- SRU implementation in PyTorch ("Training RNNs as Fast as CNNs") ☆46 · Updated 2 years ago
- NLSTM: Nested LSTM in PyTorch ☆18 · Updated 7 years ago
- PyTorch implementation of batched bi-RNN encoder and attention-decoder ☆279 · Updated 6 years ago
- Using deep stacked residual bidirectional LSTM cells (RNN) with TensorFlow, we do Human Activity Recognition (HAR). Classifying the type … ☆318 · Updated 2 years ago
- Variational Autoencoder with Recurrent Neural Network based on Google DeepMind's "DRAW: A Recurrent Neural Network For Image Generation" ☆39 · Updated 8 years ago
- Visualizing RNNs using the attention mechanism ☆750 · Updated 5 years ago
- TensorFlow implementation of GAN modeling for sequential data ☆69 · Updated 7 years ago
- PyTorch implementation of a basic language model using attention in an LSTM network ☆26 · Updated 6 years ago
- Dual-Stage Attention Model for Time Series Prediction ☆66 · Updated 7 years ago