lizhenping / multi-head-self-attention

Tests the multi-head attention mechanism on the STS dataset. Built with PyTorch and torchtext; the code is concise and well suited for newcomers who want to understand how multi-head attention works without the many layers a full Transformer involves: just multi-head attention + one linear layer.
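The description implies a deliberately small model: a single multi-head self-attention layer feeding one linear output layer. Below is a minimal sketch of what such an architecture could look like in PyTorch; it is not the repository's actual code, and the class name, dimensions, and mean-pooling step are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AttnRegressor(nn.Module):
    """Hypothetical sketch: one multi-head self-attention layer + one linear layer."""
    def __init__(self, vocab_size, embed_dim=128, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # A single multi-head self-attention layer (no Transformer stack).
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # One linear layer mapping the pooled representation to a score.
        self.fc = nn.Linear(embed_dim, 1)

    def forward(self, tokens):               # tokens: (batch, seq_len)
        x = self.embed(tokens)               # (batch, seq_len, embed_dim)
        x, _ = self.attn(x, x, x)            # self-attention: Q = K = V = x
        x = x.mean(dim=1)                    # mean-pool over the sequence
        return self.fc(x).squeeze(-1)        # (batch,) similarity score

# Dummy usage with a random batch of 2 token sequences.
model = AttnRegressor(vocab_size=10000)
scores = model(torch.randint(0, 10000, (2, 16)))
print(scores.shape)  # torch.Size([2])
```

For an STS-style task, the pooled score would typically be regressed against the gold similarity label; how the repository actually pairs and pools the two input sentences is not stated here.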

Related projects

Alternatives and complementary repositories for multi-head-self-attention