A recurrent attention module (LARNN): an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. Like any other RNN, the LARNN cell can be used inside a loop over the cell state.
☆148 · Aug 20, 2018 · Updated 7 years ago
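The description above can be made concrete with a minimal PyTorch sketch: an LSTM cell whose new cell state attends over a window of its own previous cell states via multi-head attention, driven from an ordinary loop over time steps. This is an illustrative approximation of the idea, not the repository's actual API; the class name `WindowedAttentionLSTMCell` and its arguments are hypothetical.

```python
# Hypothetical sketch of an LSTM cell that queries a window of its own
# past cell states with multi-head attention. Names are illustrative,
# not the LARNN repository's actual API.
import torch
import torch.nn as nn


class WindowedAttentionLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_heads=4, window=16):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.window = window

    def forward(self, x, state, past_cells):
        # past_cells: list of previous cell states, each of shape (batch, hidden)
        h, c = self.cell(x, state)
        if past_cells:
            # Attend from the current cell state over the last `window` cell states.
            memory = torch.stack(past_cells[-self.window:], dim=1)  # (B, W, H)
            query = c.unsqueeze(1)                                  # (B, 1, H)
            attended, _ = self.attn(query, memory, memory)
            c = c + attended.squeeze(1)  # residual merge of the attention output
        past_cells.append(c)
        return h, (h, c)


# Usage: loop on the cell state, just like any other RNN cell.
cell = WindowedAttentionLSTMCell(input_size=8, hidden_size=32)
x = torch.randn(5, 10, 8)  # (batch, time, features)
state = (torch.zeros(5, 32), torch.zeros(5, 32))
past_cells = []
for t in range(x.size(1)):
    out, state = cell(x[:, t], state, past_cells)
```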
Alternatives and similar repositories for Linear-Attention-Recurrent-Neural-Network
Users interested in Linear-Attention-Recurrent-Neural-Network are comparing it to the repositories listed below.
- PTB Language Modelling task with LSTM + Attention layer ☆31 · Feb 27, 2018 · Updated 8 years ago
- Tensors and Dynamic neural networks in Python with strong GPU acceleration