Norod / TrainGPT2-127M-FromScratch

A trio of Google Colab notebooks (ipynb) for training a GPT-2 (127M) model from scratch (useful for other/non-English languages) using gpt-2-simple
15 stars · Updated 4 years ago
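
As a rough illustration of the kind of workflow the notebooks build on, the sketch below shows a basic gpt-2-simple training call in Python. It is not the repository's exact code: the file name, step counts, and other hyperparameters are illustrative assumptions, and the extra work the notebooks do for true from-scratch training on a non-English corpus (training a new tokenizer and starting from freshly initialized weights) is not shown here.

```python
# Minimal sketch of a gpt-2-simple training run (illustrative, not the notebook code).
import gpt_2_simple as gpt2

model_name = "124M"  # smallest GPT-2 configuration, used here as the base architecture
gpt2.download_gpt2(model_name=model_name)  # fetches the model files into ./models/

sess = gpt2.start_tf_sess()
gpt2.finetune(
    sess,
    dataset="corpus.txt",   # plain-text training corpus (assumed filename)
    model_name=model_name,
    steps=1000,             # illustrative step count
    restore_from="fresh",   # start a new run from the base model files instead of resuming a checkpoint
    run_name="run1",
    print_every=50,
    sample_every=200,
    save_every=500,
)

gpt2.generate(sess, run_name="run1")  # sample from the trained checkpoint
```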

Alternatives and similar repositories for TrainGPT2-127M-FromScratch:

Users interested in TrainGPT2-127M-FromScratch are comparing it to the libraries listed below.