datnnt1997 / multi-head_self-attention

A Faster PyTorch Implementation of Multi-Head Self-Attention
74 stars · Updated 3 years ago
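
For context, below is a minimal sketch of multi-head self-attention in PyTorch. It implements the standard scaled dot-product formulation; the class name `MultiHeadSelfAttention` and its parameters are illustrative, not this repository's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadSelfAttention(nn.Module):
    """Standard multi-head self-attention (illustrative sketch, not this repo's code)."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One fused projection produces queries, keys, and values in a single matmul.
        self.qkv_proj = nn.Linear(embed_dim, 3 * embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape
        qkv = self.qkv_proj(x)  # (batch, seq_len, 3 * embed_dim)
        # Split into q, k, v, each shaped (batch, num_heads, seq_len, head_dim).
        qkv = qkv.reshape(batch, seq_len, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)
        # Scaled dot-product attention computed independently per head.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        attn = F.softmax(scores, dim=-1)
        out = attn @ v  # (batch, num_heads, seq_len, head_dim)
        # Merge heads back into the embedding dimension.
        out = out.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out_proj(out)


# Usage: a batch of 2 sequences of 16 tokens, 64-dim embeddings, 8 heads.
mhsa = MultiHeadSelfAttention(embed_dim=64, num_heads=8)
y = mhsa(torch.randn(2, 16, 64))
print(y.shape)  # torch.Size([2, 16, 64])
```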

Alternatives and similar repositories for multi-head_self-attention

Users interested in multi-head_self-attention are comparing it to the libraries listed below.
