lizhenping / multi-head-self-attention

Tests multi-head self-attention on the STS dataset. Built with PyTorch and torchtext, the code is concise and well suited for newcomers who want to understand how multi-head attention works, without the many layers a full Transformer involves: just multi-head attention plus one linear layer (see the sketch below).
18 stars · Updated Aug 20, 2025 (5 months ago)
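
Below is a minimal sketch of the kind of architecture the description points at: a single multi-head self-attention block followed by one linear layer, written with PyTorch's built-in `nn.MultiheadAttention`. The class name, hyperparameters, and the mean-pooling step are illustrative assumptions, not code taken from the repository.

```python
import torch
import torch.nn as nn


class MultiHeadSelfAttentionClassifier(nn.Module):
    """Illustrative sketch: multi-head self-attention + one linear layer."""

    def __init__(self, embed_dim: int = 128, num_heads: int = 8, num_classes: int = 2):
        super().__init__()
        # PyTorch's built-in multi-head attention; batch_first=True expects
        # inputs of shape (batch, seq_len, embed_dim).
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # The single linear layer mentioned in the description.
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Self-attention: query, key, and value are all the same sequence.
        attn_out, _ = self.attn(x, x, x)
        # Mean-pool over the sequence dimension, then project to logits.
        pooled = attn_out.mean(dim=1)
        return self.fc(pooled)


if __name__ == "__main__":
    model = MultiHeadSelfAttentionClassifier()
    dummy = torch.randn(4, 20, 128)  # (batch, seq_len, embed_dim)
    print(model(dummy).shape)        # torch.Size([4, 2])
```

The repository itself evaluates on STS (a sentence-similarity task) and uses torchtext for data loading, so its actual input pipeline and output head will differ from this toy classifier; the sketch only shows the attention-plus-linear shape of the model.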

Alternatives and similar repositories for multi-head-self-attention

Users interested in multi-head-self-attention are comparing it to the libraries listed below.
