guillaume-chevalier / Linear-Attention-Recurrent-Neural-Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop over the cell state, just like any other RNN. (LARNN)
☆145 · Updated 6 years ago
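The idea described above — an LSTM cell that attends over a sliding window of its own past cell states — can be sketched in a few lines of NumPy. This is a hypothetical, single-head simplification for illustration only (the actual LARNN uses multi-head attention and BN-LSTM formulas); the class and parameter names are not from the repository.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class LARNNSketch:
    """Toy LSTM cell augmented with an attention read over a sliding
    window of its own past cell states (single head, no batch norm --
    a simplification of the LARNN idea, not the repo's implementation)."""

    def __init__(self, input_size, hidden_size, window=4, seed=0):
        rng = np.random.default_rng(seed)
        self.h, self.window = hidden_size, window
        # The attention read is concatenated to [x, h] before the gates,
        # so the gate input is input_size + 2 * hidden_size wide.
        gate_in = input_size + 2 * hidden_size
        self.W = rng.standard_normal((gate_in, 4 * hidden_size)) * 0.1
        self.b = np.zeros(4 * hidden_size)

    def attend(self, query, past_states):
        # Scaled dot-product attention over the last `window` cell states.
        keys = np.stack(past_states[-self.window:])      # (w, h)
        scores = keys @ query / np.sqrt(self.h)          # (w,)
        return softmax(scores) @ keys                    # (h,)

    def step(self, x, h, c, past_states):
        # Query the window of past cell states with the current hidden state.
        a = self.attend(h, past_states) if past_states else np.zeros(self.h)
        z = np.concatenate([x, h, a]) @ self.W + self.b
        i, f, g, o = np.split(z, 4)
        sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)     # standard LSTM update
        h = sigmoid(o) * np.tanh(c)
        past_states.append(c)                            # grow the window
        return h, c
```

As the description says, the cell is simply driven inside a loop over the cell state, with the list of past cell states carried along as extra recurrent context.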
Alternatives and similar repositories for Linear-Attention-Recurrent-Neural-Network
Users interested in Linear-Attention-Recurrent-Neural-Network are comparing it with the repositories listed below.
- Dilated RNNs in PyTorch ☆213 · Updated 6 years ago
- Implementation of IndRNN in Keras ☆67 · Updated 5 years ago
- Repository for the ablation study of "Long Short-Term Memory Fully Convolutional Networks for Time Series Classification" ☆54 · Updated 6 years ago
- PyTorch implementation of Independently Recurrent Neural Networks https://arxiv.org/abs/1803.04831 ☆121 · Updated 6 years ago
- This repository contains the source for the paper "S-LSTM-GAN: Shared recurrent neural networks with adversarial training" ☆88 · Updated 6 years ago
- Keras implementation of Nested LSTMs ☆88 · Updated 6 years ago
- Codebase for the paper "LSTM Fully Convolutional Networks for Time Series Classification" ☆141 · Updated 6 years ago
- Collection of custom layers and utility functions for Keras that are missing from the main framework ☆62 · Updated 5 years ago
- PyTorch implementations of LSTM variants (Dropout + Layer Norm) ☆137 · Updated 4 years ago
- ☆11 · Updated 7 years ago
- [ICLR'19] Trellis Networks for Sequence Modeling ☆471 · Updated 5 years ago
- Deep Neural Network Ensembles for Time Series Classification ☆111 · Updated last year
- ☆40 · Updated 7 years ago
- A TensorFlow implementation of GAN (specifically InfoGAN) for one-dimensional (1D) time series data ☆299 · Updated last year
- Phased LSTM: Accelerating Recurrent Network Training for Long or Event-based Sequences (NIPS 2016), TensorFlow 1.0 ☆128 · Updated 6 years ago
- ☆64 · Updated 2 years ago
- Keras implementation of an LSTM Variational Autoencoder ☆227 · Updated 5 years ago
- Deep Embedding Clustering in Keras ☆132 · Updated 8 years ago
- Variational Autoencoder with Recurrent Neural Network based on Google DeepMind's "DRAW: A Recurrent Neural Network For Image Generation" ☆39 · Updated 8 years ago
- TensorFlow implementation of DilatedRNN ☆351 · Updated 7 years ago
- Keras implementation of Attention Augmented Convolutional Neural Networks ☆121 · Updated 5 years ago
- TensorFlow LSTM-autoencoder implementation ☆191 · Updated 7 years ago
- Keras implementation of Phased LSTM [https://arxiv.org/abs/1610.09513] ☆144 · Updated 5 years ago
- STCN: Stochastic Temporal Convolutional Networks ☆69 · Updated 5 years ago
- TensorFlow implementation of a Hierarchical and Multiscale RNN, described in https://arxiv.org/abs/1609.01704 ☆135 · Updated 7 years ago
- This repository contains various types of attention mechanisms like Bahdanau, Soft Attention, Additive Attention, Hierarchical Attention… ☆125 · Updated 3 years ago
- A simple TensorFlow-based library for deep and/or denoising autoencoders ☆149 · Updated 7 years ago
- Multiplicative LSTM for Keras 2.0+ ☆42 · Updated 7 years ago
- Multi-dimensional LSTM as described in Alex Graves' paper https://arxiv.org/pdf/0705.2011.pdf ☆156 · Updated 5 years ago
- PyTorch neural network attention mechanism ☆147 · Updated 6 years ago