harvardnlp / annotated-transformer
An annotated implementation of the Transformer paper.
☆6,296 · Updated last year
Alternatives and similar repositories for annotated-transformer
Users interested in annotated-transformer are comparing it to the repositories listed below.
- A PyTorch implementation of the Transformer model in "Attention is All You Need". ☆9,235 · Updated last year
- Google AI 2018 BERT PyTorch implementation ☆6,416 · Updated last year
- Transformer: PyTorch implementation of "Attention Is All You Need" ☆3,782 · Updated 10 months ago
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,385 · Updated 2 weeks ago
- A PyTorch extension: tools for easy mixed precision and distributed training in PyTorch ☆8,679 · Updated last month
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆23,087 · Updated 3 months ago
- Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.