ZhengZixiang / ATPapers

Worth-reading papers and related resources on attention mechanisms, the Transformer, and pretrained language models (PLMs) such as BERT.