Norod / TrainGPT2-127M-FromScratch

A trio of Google Colab notebooks (.ipynb) for training a GPT-2 (127M-parameter) model from scratch using gpt-2-simple (useful for non-English and other languages).
Updated 4 years ago

Related projects: