ictnlp / awesome-transformer
A collection of Transformer guides, implementations, and variants.
☆104 · Updated 5 years ago
Alternatives and similar repositories for awesome-transformer
Users interested in awesome-transformer are comparing it to the repositories listed below.
- ICLR 2019: Multilingual Neural Machine Translation with Knowledge Distillation · ☆70 · Updated 4 years ago
- A project that aims to maintain SOTA performance in machine translation · ☆108 · Updated 4 years ago
- Worth-reading papers and related resources on the attention mechanism, the Transformer, and pretrained language models (PLMs) such as BERT · ☆133 · Updated 4 years ago
- PyTorch implementation of "BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning" (https://arxiv.org/ab…) · ☆82 · Updated 6 years ago
- A simple module that consistently outperforms self-attention and the Transformer model on major NMT datasets, with SOTA performance · ☆85 · Updated last year
- Data and code used in the NAACL'19 paper "Selective Attention for Context-aware Neural Machine Translation" · ☆30 · Updated 5 years ago
- PyTorch implementation of "Non-Autoregressive Neural Machine Translation" · ☆269 · Updated 3 years ago
- Transformers without Tears: Improving the Normalization of Self-Attention
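The last entry concerns where layer normalization sits relative to the residual connection in a Transformer block (post-norm, as in the original Transformer, versus pre-norm, which "Transformers without Tears" argues trains more stably). A minimal NumPy sketch of the two orderings, using a hypothetical linear map as a stand-in for the attention or feed-forward sublayer:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize the last dimension to zero mean, unit variance.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def post_norm_block(x, sublayer):
    # Original Transformer ordering: normalize AFTER the residual add.
    return layer_norm(x + sublayer(x))

def pre_norm_block(x, sublayer):
    # Pre-norm ordering: normalize the sublayer's input, leaving the
    # residual path as a clean identity from input to output.
    return x + sublayer(layer_norm(x))

# Toy sublayer standing in for attention/FFN (hypothetical, not from
# the listed repository).
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8)) * 0.1
sublayer = lambda x: x @ W

x = rng.standard_normal((2, 8))
print(post_norm_block(x, sublayer).shape)  # (2, 8)
print(pre_norm_block(x, sublayer).shape)   # (2, 8)
```

The key design difference is the residual path: in the pre-norm block the identity term `x` is never rescaled, which is the property usually credited with making deep pre-norm stacks easier to train without learning-rate warmup.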