JRC1995 / INTER-INTRA-attentions

An experimental custom seq2seq model for abstractive summarization, with both inter-layer attention (attention across layers) and intra-layer attention (attention to previous hidden states of the same RNN unit).
10 stars · Updated 7 years ago
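To make the intra-layer mechanism concrete, here is a minimal sketch of an RNN step that attends over its own previous hidden states, using additive (Bahdanau-style) scoring. This is an illustrative assumption, not the repository's actual implementation: the class name `IntraLayerAttnRNN`, the use of a GRU cell, and the scoring function are all hypothetical choices.

```python
# Hypothetical sketch of intra-layer attention: the cell's input at each step
# is augmented with an attention-weighted summary of its own past hidden states.
# Not taken from the JRC1995/INTER-INTRA-attentions code base.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IntraLayerAttnRNN(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # The cell consumes the raw input concatenated with the attention context.
        self.cell = nn.GRUCell(input_size + hidden_size, hidden_size)
        self.score = nn.Linear(2 * hidden_size, 1)  # additive scoring (assumed)

    def forward(self, x):                       # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.cell.hidden_size)
        history, outputs = [], []
        for t in range(seq_len):
            if history:
                past = torch.stack(history, dim=1)            # (batch, t, hidden)
                query = h.unsqueeze(1).expand_as(past)
                e = self.score(torch.cat([past, query], -1))  # (batch, t, 1)
                a = F.softmax(e, dim=1)                       # weights over history
                context = (a * past).sum(dim=1)               # (batch, hidden)
            else:
                context = torch.zeros_like(h)  # no history at the first step
            h = self.cell(torch.cat([x[:, t], context], dim=-1), h)
            history.append(h)
            outputs.append(h)
        return torch.stack(outputs, dim=1)      # (batch, seq_len, hidden)
```

For example, `IntraLayerAttnRNN(32, 64)(torch.randn(2, 10, 32))` returns a `(2, 10, 64)` tensor. Inter-layer attention would follow the same pattern, with the query layer attending over the hidden states of the layer below instead of its own history.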

Alternatives and similar repositories for INTER-INTRA-attentions

Users interested in INTER-INTRA-attentions are comparing it to the libraries listed below.
