CyberZHG / keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.
656 stars · Updated 2 years ago
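
A minimal usage sketch, assuming the package is installed via `pip install keras-self-attention` and exposes the `SeqSelfAttention` layer with an `attention_activation` argument (as described in the repo's README); the surrounding model architecture, layer sizes, and the use of `tensorflow.keras` as the backend are illustrative assumptions, not prescribed by the project.

```python
# Illustrative sketch: a BiLSTM sequence-labeling model with SeqSelfAttention
# inserted over the timesteps so each position attends to its context.
from tensorflow import keras
from keras_self_attention import SeqSelfAttention  # layer provided by keras-self-attention

model = keras.models.Sequential([
    # Token embeddings; mask_zero lets padded positions be ignored downstream.
    keras.layers.Embedding(input_dim=10000, output_dim=128, mask_zero=True),
    # Bidirectional LSTM returning the full sequence of hidden states.
    keras.layers.Bidirectional(keras.layers.LSTM(64, return_sequences=True)),
    # Self-attention across timesteps (attention_activation per the README example).
    SeqSelfAttention(attention_activation='sigmoid'),
    # Per-timestep classification head (sizes chosen arbitrarily for the sketch).
    keras.layers.Dense(5, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```

Depending on the library version, it may expect the standalone `keras` package or require setting `TF_KERAS=1` before import to use `tensorflow.keras`; check the project README for the supported backend.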

Related projects

Alternatives and complementary repositories for keras-self-attention