AliHaiderAhmad001 / BERT-from-Scratch-with-PyTorch
Implementation of BERT-based Language Models
☆24 · Updated last year
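For context on what this repository and the alternatives below implement, here is a minimal sketch of a BERT-style masked-language-model encoder in PyTorch. The class name, hyperparameters, and layer choices are illustrative placeholders only, not taken from the repository itself.

```python
# Minimal BERT-style masked-language-model sketch (illustrative only; not the
# repository's actual code). Names and hyperparameters are placeholders.
import torch
import torch.nn as nn

class MiniBERT(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, n_heads=4,
                 n_layers=4, max_len=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)   # token embeddings
        self.pos_emb = nn.Embedding(max_len, d_model)       # learned positions
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.mlm_head = nn.Linear(d_model, vocab_size)      # masked-token logits

    def forward(self, input_ids, attention_mask=None):
        positions = torch.arange(input_ids.size(1), device=input_ids.device)
        x = self.tok_emb(input_ids) + self.pos_emb(positions)
        # TransformerEncoder expects True at positions that should be ignored.
        pad_mask = (attention_mask == 0) if attention_mask is not None else None
        x = self.encoder(x, src_key_padding_mask=pad_mask)
        return self.mlm_head(x)  # (batch, seq_len, vocab_size)

# Usage sketch: score masked positions with cross-entropy against original ids.
model = MiniBERT()
ids = torch.randint(0, 30522, (2, 16))
logits = model(ids, attention_mask=torch.ones_like(ids))
```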
Alternatives and similar repositories for BERT-from-Scratch-with-PyTorch
Users interested in BERT-from-Scratch-with-PyTorch are comparing it to the repositories listed below
- Tutorial on how to build BERT from scratch ☆100 · Updated last year
- Public resources for the Hands-On GenAI book ☆218 · Updated 11 months ago
- LLaMA 2 implemented from scratch in PyTorch ☆361 · Updated 2 years ago
- Accelerate Model Training with PyTorch 2.X, published by Packt ☆48 · Updated 3 weeks ago
- LoRA: Low-Rank Adaptation of Large Language Models, implemented in PyTorch ☆117 · Updated 2 years ago
- Llama from scratch, or How to implement a paper without crying ☆581 · Updated last year
- ☆81 · Updated last year
- Well-documented, unit-tested, type-checked, and formatted implementation of a vanilla transformer, for educational purposes ☆271 · Updated last year
- Implementation of "Attention Is All You Need" ☆1,111 · Updated last year
- Getting Started with PyTorch Lightning, published by Packt ☆161 · Updated 3 weeks ago
- Recreating PyTorch from scratch (C/C++, CUDA, NCCL, and Python, with multi-GPU support and automatic differentiation!) ☆161 · Updated last week
- Learn Generative AI with PyTorch (Manning Publications, 2024) ☆130 · Updated 6 months ago
- Notes on the "Attention is all you need" video (https://www.youtube.com/watch?v=bCz4OMemCcA) ☆329 · Updated 2 years ago
- Notes on quantization in neural networks ☆111 · Updated last year
- Inside Deep Learning: The math, the algorithms, the models ☆269 · Updated 2 years ago
- Distributed (multi-node) training of a Transformer model ☆88 · Updated last year
- PDFs and Codelabs for the Efficient Deep Learning book ☆203 · Updated 2 years ago
- ☆188 · Updated last year
- ☆99 · Updated last year
- Transformers, 3rd Edition ☆459 · Updated 3 months ago
- ☆162 · Updated last year
- Website ☆57 · Updated 2 years ago
- An extension of the nanoGPT repository for training small MoE models ☆215 · Updated 8 months ago
- Mastering Transformers, published by Packt ☆358 · Updated 3 weeks ago
- Research projects built on top of Transformers ☆103 · Updated 8 months ago
- ☆146 · Updated last year
- Notes and commented code for RLHF (PPO) ☆118 · Updated last year
- LLaMA 3 is one of the most promising open-source models after Mistral; this project recreates its architecture in a simpler manner ☆191 · Updated last year
- 🧠 A study guide for learning about Transformers ☆12 · Updated last year
- ☆156 · Updated last year