warner-benjamin / commented-transformers
Highly commented implementations of Transformers in PyTorch
☆138 · Updated 2 years ago
Alternatives and similar repositories for commented-transformers
Users interested in commented-transformers are comparing it to the libraries listed below.
- Minimal example scripts of the Hugging Face Trainer, focused on staying under 150 lines ☆196 · Updated last year
- A miniature AI training framework for PyTorch ☆42 · Updated last year
- ☆102 · Updated last week
- ☆171 · Updated last year
- ☆94 · Updated 2 years ago
- Source notebook code for the course, stripped of all information. Please consider purchasing the course at https://store.walkwithfastai.co… ☆36 · Updated last year
- Gzip and nearest neighbors for text classification ☆57 · Updated 2 years ago
- Notes from the Latent Space paper club. Follow along or start your own! ☆243 · Updated last year
- A set of scripts and notebooks on LLM finetuning and dataset creation ☆116 · Updated last year
- A comprehensive deep dive into the world of tokens ☆226 · Updated last year
- ☆79 · Updated last year
- Train fastai models faster (and other useful tools) ☆72 · Updated 8 months ago
- Helpers and such for working with Lambda Cloud ☆52 · Updated 2 years ago
- ☆90 · Updated 2 years ago
- Comprehensive analysis of the difference in performance of QLoRA, LoRA, and full fine-tunes ☆83 · Updated 2 years ago
- ML/DL Math and Method notes ☆66 · Updated 2 years ago
- MinT: Minimal Transformer Library and Tutorials ☆260 · Updated 3 years ago
- ☆215 · Updated last year
- Fully fine-tune large models like Mistral, Llama-2-13B, or Qwen-14B completely for free ☆232 · Updated last year