An implementation of GPT2 training with TPU support
☆1,414 · Dec 12, 2022 · Updated 3 years ago
Alternatives and similar repositories for GPT2
Users interested in GPT2 are comparing it to the repositories listed below.
- Code for the paper "Language Models are Unsupervised Multitask Learners" · ☆24,648 · Aug 14, 2024 · Updated last year
- XLNet: Generalized Autoregressive Pretraining for Language Understanding · ☆6,176 · May 28, 2023 · Updated 2 years ago
- Chinese version of GPT2 training code, using BERT tokenizer · ☆7,599 · Apr 25, 2024 · Updated last year
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts · ☆3,405 · Dec 14, 2022 · Updated 3 years ago
- GPT2 for Multiple Languages, including pretrained models (GPT2 multilingual support; 1.5-billion-parameter Chinese pretrained model)