jsbaan / transformer-from-scratch
Well-documented, unit-tested, type-checked, and formatted implementation of a vanilla transformer, for educational purposes.
☆254 · Updated last year
Alternatives and similar repositories for transformer-from-scratch
Users interested in transformer-from-scratch are comparing it to the libraries listed below.
- Llama from scratch, or How to implement a paper without crying ☆574 · Updated last year
- A Simplified PyTorch Implementation of Vision Transformer (ViT) ☆196 · Updated last year
- I will build Transformer from scratch ☆78 · Updated 2 weeks ago
- Annotated version of the Mamba paper ☆487 · Updated last year
- Attention Is All You Need | a PyTorch Tutorial to Transformers ☆332 · Updated last year
- Tutorial for how to build BERT from scratch ☆97 · Updated last year
- LLaMA 2 implemented from scratch in PyTorch ☆345 · Updated last year
- LoRA: Low-Rank Adaptation of Large Language Models implemented using PyTorch ☆112 · Updated 2 years ago
- Annotations of the interesting ML papers I read ☆245 · Updated last week
- Recreating PyTorch from scratch (C/C++, CUDA, NCCL and Python, with multi-GPU support and automatic differentiation!) ☆151 · Updated 11 months ago
- The Tensor (or Array) ☆441 · Updated 11 months ago
- Notes about the "Attention is all you need" video (https://www.youtube.com/watch?v=bCz4OMemCcA) ☆299 · Updated 2 years ago
- Fast bare-bones BPE for modern tokenizer training ☆164 · Updated last month
- The Multilayer Perceptron Language Model ☆557 · Updated 11 months ago
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1 GPU + 1 Day ☆256 · Updated last year
- ☆181 · Updated last year
- Code Transformer neural network components piece by piece ☆358 · Updated 2 years ago
- A set of scripts and notebooks on LLM finetuning and dataset creation ☆110 · Updated 10 months ago
- Documented and unit-tested educational Deep Learning framework with Autograd from scratch ☆117 · Updated last year
- Puzzles for exploring transformers ☆356 · Updated 2 years ago
- Starter pack for NeurIPS LLM Efficiency Challenge 2023 ☆125 · Updated last year
- MinT: Minimal Transformer Library and Tutorials ☆256 · Updated 3 years ago
- Code implementation from my blog post: https://fkodom.substack.com/p/transformers-from-scratch-in-pytorch ☆95 · Updated 2 years ago
- ☆90 · Updated 10 months ago
- LLM Workshop by Sourab Mangrulkar ☆388 · Updated last year
- For optimization algorithm research and development ☆524 · Updated this week
- Training small GPT-2 style models using Kolmogorov-Arnold networks ☆121 · Updated last year
- An extension of the nanoGPT repository for training small MoE models ☆164 · Updated 4 months ago
- Deep learning for dummies. All the practical details and useful utilities that go into working with real models. ☆809 · Updated last week
- Distributed training (multi-node) of a Transformer model ☆76 · Updated last year