datnnt1997 / multi-head_self-attention

A Faster PyTorch Implementation of Multi-Head Self-Attention
★ 71 · Updated 2 years ago
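For context, below is a minimal sketch of what a standard multi-head self-attention module looks like in PyTorch. This is not the repository's actual code; the class name, fused QKV projection, and shapes are illustrative assumptions chosen only to show the technique the repo implements.

```python
# Illustrative sketch of multi-head self-attention in PyTorch.
# NOT the repository's implementation; names and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadSelfAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # A single fused projection for Q, K, V is a common speed optimization.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split embed_dim into heads: (batch, num_heads, seq_len, head_dim)
        q = q.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention, computed per head in parallel.
        scores = q @ k.transpose(-2, -1) / (self.head_dim ** 0.5)
        attn = F.softmax(scores, dim=-1)
        out = attn @ v                              # (b, heads, t, head_dim)
        out = out.transpose(1, 2).reshape(b, t, d)  # merge heads back
        return self.out(out)


if __name__ == "__main__":
    mhsa = MultiHeadSelfAttention(embed_dim=64, num_heads=8)
    x = torch.randn(2, 10, 64)   # (batch, seq_len, embed_dim)
    print(mhsa(x).shape)         # torch.Size([2, 10, 64])
```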

Alternatives and similar repositories for multi-head_self-attention:

Users interested in multi-head_self-attention are comparing it to the libraries listed below.