dreamgonfly / transformer-pytorch
A PyTorch implementation of the Transformer in "Attention is All You Need"
☆106 · Updated 4 years ago
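The Transformer described in that paper is built around scaled dot-product attention. As a rough orientation only, a minimal PyTorch sketch is shown below; the function name and the (batch, heads, seq_len, d_k) tensor layout are illustrative assumptions, not code taken from this repository:

```python
import math
import torch

def scaled_dot_product_attention(query, key, value, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.

    query, key, value: tensors of shape (batch, heads, seq_len, d_k).
    Illustrative sketch, not the repository's actual code.
    """
    d_k = query.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return torch.matmul(weights, value), weights
```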
Alternatives and similar repositories for transformer-pytorch
Users interested in transformer-pytorch are comparing it to the libraries listed below
- PyTorch implementation of BERT in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" ☆109 · Updated 6 years ago
- Transformers without Tears: Improving the Normalization of Self-Attention ☆133 · Updated last year
- PyTorch implementation for Seq2Seq model with attention and Greedy Search / Beam Search for neural machine translation ☆58 · Updated 4 years ago
- Code for the paper "Are Sixteen Heads Really Better than One?" ☆172 · Updated 5 years ago
- A PyTorch implementation of Google AI's BERT model provided with Google's pre-trained models, examples and utilities. ☆35 · Updated 6 years ago
- A PyTorch implementation of self-attention with relative position representations ☆50 · Updated 4 years ago
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators ☆91 · Updated 4 years ago
- Implementation of RealFormer using PyTorch ☆101 · Updated 4 years ago
- Encoding position with the word embeddings. ☆83 · Updated 7 years ago
- PyTorch implementation of ALBERT (A Lite BERT for Self-supervised Learning of Language Representations) ☆227 · Updated 4 years ago
- Unicoder model for understanding and generation. ☆91 · Updated last year
- ☆219 · Updated 5 years ago
- A collection of Transformer guides, implementations and variants. ☆106 · Updated 5 years ago
- ICLR 2019: Multilingual Neural Machine Translation with Knowledge Distillation ☆70 · Updated 4 years ago
- PyTorch implementation of beam search decoding for seq2seq models ☆339 · Updated 2 years ago
- A PyTorch implementation of the paper "Synthesizer: Rethinking Self-Attention in Transformer Models" ☆73 · Updated 2 years ago
- Implementation of Mixout with PyTorch ☆75 · Updated 2 years ago
- Multi30k Dataset ☆181 · Updated 3 years ago
- Implements Reformer: The Efficient Transformer in PyTorch. ☆86 · Updated 5 years ago
- Visualization for simple attention and Google's multi-head attention. ☆68 · Updated 7 years ago
- Code release for the arXiv paper "Revisiting Few-sample BERT Fine-tuning" (https://arxiv.org/abs/2006.05987) ☆185 · Updated 2 years ago
- Method to improve inference time for BERT. This is an implementation of the paper titled "PoWER-BERT: Accelerating BERT Inference via Pro… ☆62 · Updated this week
- Repository for the paper "Fast and Accurate Deep Bidirectional Language Representations for Unsupervised Learning" ☆110 · Updated 4 years ago
- Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper ☆135 · Updated 2 years ago
- MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices ☆69 · Updated 5 years ago
- ☆255 · Updated 2 years ago
- DisCo Transformer for Non-autoregressive MT ☆77 · Updated 3 years ago
- Reproducing "Character-Level Language Modeling with Deeper Self-Attention" in PyTorch ☆61 · Updated 6 years ago
- ☆93 · Updated 4 years ago
- A PyTorch implementation of Google AI's BERT model provided with Google's pre-trained models, examples and utilities. ☆71 · Updated 3 years ago