lizhenping / multi-head-self-attention
Tests a multi-head attention mechanism on the STS dataset. PyTorch + torchtext. The code is concise and well suited for newcomers who want to see how multi-head attention works without the many layers a full Transformer involves: just multi-head attention + one linear layer.
18 stars · Updated Aug 20, 2025
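The description is specific enough to sketch the architecture it names. Below is a minimal, hypothetical PyTorch sketch (class names, hyperparameters, and the mean-pooling readout are assumptions, not the repository's actual code) of multi-head self-attention feeding a single linear layer for an STS-style similarity score.

```python
import torch
import torch.nn as nn


class MultiHeadSelfAttention(nn.Module):
    """Minimal multi-head self-attention block (sketch, not the repo's code)."""

    def __init__(self, embed_dim: int = 128, num_heads: int = 4):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # Joint projection producing queries, keys, and values in one matmul.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        def split_heads(z: torch.Tensor) -> torch.Tensor:
            # (batch, seq_len, embed_dim) -> (batch, heads, seq_len, head_dim)
            return z.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split_heads(q), split_heads(k), split_heads(v)
        # Scaled dot-product attention, computed per head.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        attn = scores.softmax(dim=-1)
        # Merge heads back: (batch, seq_len, embed_dim)
        ctx = (attn @ v).transpose(1, 2).reshape(b, t, d)
        return self.out(ctx)


class SimilarityModel(nn.Module):
    """Hypothetical end-to-end layout matching the description:
    embedding -> multi-head attention -> pooling -> one linear output."""

    def __init__(self, vocab_size: int = 10000, embed_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.attn = MultiHeadSelfAttention(embed_dim)
        self.score = nn.Linear(embed_dim, 1)  # the "one layer linear" head

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        h = self.attn(self.embed(tokens))       # (batch, seq_len, embed_dim)
        return self.score(h.mean(dim=1)).squeeze(-1)  # mean-pool, then score


if __name__ == "__main__":
    model = SimilarityModel()
    batch = torch.randint(0, 10000, (2, 16))  # two sequences of 16 token ids
    print(model(batch).shape)  # torch.Size([2]): one similarity score per item
```

The point of this layout, as the description suggests, is pedagogical: with the Transformer's residual stacks, layer norms, and feed-forward blocks stripped away, the attention computation itself is the whole model.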

Alternatives and similar repositories for multi-head-self-attention

Users that are interested in multi-head-self-attention are comparing it to the libraries listed below.
