pbloem / former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
☆1,094 · Updated 10 months ago
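As a rough illustration of what a "transformer from scratch" repository such as former typically implements, here is a minimal sketch of multi-head scaled dot-product self-attention in PyTorch. This is not former's actual code; the class name, parameter names, and shapes are my own assumptions for illustration.

```python
# Minimal sketch of multi-head scaled dot-product self-attention in PyTorch.
# Illustrative only: names and structure are assumptions, not taken from former.
import torch
import torch.nn.functional as F
from torch import nn


class SelfAttention(nn.Module):
    def __init__(self, emb: int, heads: int = 4):
        super().__init__()
        assert emb % heads == 0, "embedding dim must be divisible by heads"
        self.emb, self.heads = emb, heads
        # One linear projection each for queries, keys, and values.
        self.to_q = nn.Linear(emb, emb, bias=False)
        self.to_k = nn.Linear(emb, emb, bias=False)
        self.to_v = nn.Linear(emb, emb, bias=False)
        self.unify = nn.Linear(emb, emb)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, e = x.size()
        h, s = self.heads, e // self.heads
        # Project and split into heads: (b, t, e) -> (b, h, t, s)
        q = self.to_q(x).view(b, t, h, s).transpose(1, 2)
        k = self.to_k(x).view(b, t, h, s).transpose(1, 2)
        v = self.to_v(x).view(b, t, h, s).transpose(1, 2)
        # Scaled dot-product attention over the sequence dimension.
        dots = q @ k.transpose(-2, -1) / (s ** 0.5)   # (b, h, t, t)
        attn = F.softmax(dots, dim=-1)
        out = attn @ v                                 # (b, h, t, s)
        # Recombine heads and mix with a final linear layer.
        out = out.transpose(1, 2).contiguous().view(b, t, e)
        return self.unify(out)


# Quick smoke test on random data.
x = torch.randn(2, 16, 128)                   # (batch, sequence, embedding)
print(SelfAttention(128, heads=8)(x).shape)   # torch.Size([2, 16, 128])
```

The libraries listed below implement variants of this same core operation, often trading exactness for speed (e.g. Reformer, Performer, Longformer) or packaging it for teaching purposes.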
Alternatives and similar repositories for former
Users interested in former are comparing it to the libraries listed below.
- My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing o… ☆1,082 · Updated 5 years ago
- A collection of resources to study Transformers in depth. ☆560 · Updated 2 years ago
- Pytorch library for fast transformer implementations ☆1,759 · Updated 2 years ago
- PyTorch implementation of some attentions for Deep Learning Researchers. ☆547 · Updated 3 years ago
- ☆828 · Updated 9 months ago
- Fast, general, and tested differentiable structured prediction in PyTorch ☆1,123 · Updated 3 years ago
- PyTorch tutorials and best practices. ☆1,710 · Updated 10 months ago
- FrancescoSaverioZuppichini / Pytorch-how-and-when-to-use-Module-Sequential-ModuleList-and-ModuleDict: Code for my medium article ☆375 · Updated 5 years ago
- An unofficial styleguide and best practices summary for PyTorch ☆2,008 · Updated 4 years ago
- Course notes ☆739 · Updated last year
- Long Range Arena for Benchmarking Efficient Transformers ☆775 · Updated 2 years ago
- Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2023 ☆3,072 · Updated 2 months ago
- Reformer, the efficient Transformer, in Pytorch ☆2,192 · Updated 2 years ago
- Hopfield Networks is All You Need ☆1,896 · Updated 2 years ago
- The Hitchiker's Guide to PyTorch ☆1,199 · Updated 4 years ago
- Pytorch Lightning code guideline for conferences ☆1,287 · Updated 2 years ago
- An implementation of Performer, a linear attention-based transformer, in Pytorch ☆1,173 · Updated 3 years ago
- A walkthrough of transformer architecture code ☆370 · Updated last year
- A repository containing tutorials for practical NLP using PyTorch ☆538 · Updated 6 years ago
- A learning rate range test implementation in PyTorch ☆997 · Updated 7 months ago
- Attention Is All You Need | a PyTorch Tutorial to Transformers ☆363 · Updated last year
- Well documented, unit tested, type checked and formatted implementation of a vanilla transformer - for educational purposes. ☆278 · Updated last year
- Toolbox of models, callbacks, and datasets for AI/ML researchers. ☆1,754 · Updated last week
- Papers & presentation materials from Hugging Face's internal science day ☆2,053 · Updated 5 years ago
- PyTorch 101 series covering everything from the basic building blocks all the way to building custom architectures. ☆265 · Updated 5 years ago
- The goal of this library is to generate more helpful exception messages for matrix algebra expressions for numpy, pytorch, jax, tensorflo… ☆811 · Updated 3 years ago
- FastFormers - highly efficient transformer models for NLU ☆709 · Updated 10 months ago
- Longformer: The Long-Document Transformer ☆2,181 · Updated 2 years ago
- Original transformer paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information process… ☆242 · Updated last year
- A Visual Analysis Tool to Explore Learned Representations in Transformers Models ☆603 · Updated last year