lizhenping / multi-head-self-attention

Tests multi-head attention on the STS dataset, using PyTorch and torchtext. The code is concise and well suited for newcomers who want to see how multi-head attention works without the many layers a full Transformer involves: just multi-head attention plus one linear layer (a minimal sketch of this setup is shown below).
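The repository's own code is not shown here, but a minimal sketch of what "multi-head attention + one linear layer" might look like in PyTorch follows. The class name `SimilarityModel`, the pooling strategy, and parameters such as `embed_dim` and `num_heads` are illustrative assumptions, not the repo's actual implementation.

```python
# Minimal sketch (assumed, not the repository's code): one multi-head
# self-attention block followed by a single linear layer, roughly matching
# the "multi-head attention + one layer linear" description above.
import torch
import torch.nn as nn


class SimilarityModel(nn.Module):
    """Encode a token sequence with multi-head self-attention,
    then project the pooled representation with one linear layer."""

    def __init__(self, vocab_size: int, embed_dim: int = 128, num_heads: int = 4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # batch_first=True lets inputs be shaped (batch, seq_len, embed_dim)
        self.attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.linear = nn.Linear(embed_dim, 1)  # one scalar, e.g. an STS-style score

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embedding(token_ids)           # (batch, seq_len, embed_dim)
        attn_out, _ = self.attention(x, x, x)   # self-attention: Q = K = V = x
        pooled = attn_out.mean(dim=1)           # average over the sequence
        return self.linear(pooled).squeeze(-1)  # (batch,) scores


# Quick smoke test with random token ids.
model = SimilarityModel(vocab_size=1000)
tokens = torch.randint(0, 1000, (2, 12))  # batch of 2 sequences, length 12
print(model(tokens).shape)                # torch.Size([2])
```

For an STS-style task, the scalar output would typically be trained against the gold similarity score with a regression loss such as MSE; how the repository actually pairs and pools the two sentences is not specified here.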
16 · Updated 3 months ago

Alternatives and similar repositories for multi-head-self-attention:

Users who are interested in multi-head-self-attention are comparing it to the libraries listed below.