openai / gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆23,978 · Updated 11 months ago
Alternatives and similar repositories for gpt-2
Users interested in gpt-2 are comparing it to the libraries listed below.
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,991 · Updated last year
- GPT-3: Language Models are Few-Shot Learners ☆15,766 · Updated 4 years ago
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts (see the fine-tuning sketch after this list) ☆3,407 · Updated 2 years ago
- Unsupervised text tokenizer for Neural Network-based text generation (see the tokenizer sketch after this list). ☆11,111 · Updated 2 weeks ago
- Code and model for the paper "Improving Language Understanding by Generative Pre-Training" ☆2,221 · Updated 6 years ago
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆16,348 · Updated 2 years ago
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ☆15,208 · Updated this week
- Ongoing research training transformer models at scale ☆13,010 · Updated this week
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,147 · Updated 2 years ago
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,397 · Updated 3 months ago
- An implementation of training for GPT-2, with TPU support ☆1,424 · Updated 2 years ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆39,529 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
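The entry above describing a Python package for retraining GPT-2 matches the gpt-2-simple workflow: download a base checkpoint, start a TensorFlow session, fine-tune on a plain-text file, and sample. A minimal sketch using the package's documented calls; the corpus file name `shakespeare.txt` and `steps=1000` are placeholder choices:

```python
import gpt_2_simple as gpt2

# One-time step: download the 124M-parameter base model into ./models/124M.
gpt2.download_gpt2(model_name="124M")

# Fine-tune on a plain-text corpus; checkpoints are written to ./checkpoint/run1.
# "shakespeare.txt" is a placeholder path for your own training text.
sess = gpt2.start_tf_sess()
gpt2.finetune(sess, dataset="shakespeare.txt", model_name="124M", steps=1000)

# Sample text from the fine-tuned model.
gpt2.generate(sess)
```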
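The unsupervised-tokenizer entry matches SentencePiece's description; training and applying a subword model uses its documented Python API. A minimal sketch, where `corpus.txt`, the `m` model prefix, and `vocab_size=8000` are placeholder choices, not requirements:

```python
import sentencepiece as spm

# Train a subword model on a raw text corpus; writes m.model and m.vocab.
spm.SentencePieceTrainer.train(
    input="corpus.txt", model_prefix="m", vocab_size=8000
)

# Load the trained model and tokenize a sentence into subword pieces.
sp = spm.SentencePieceProcessor(model_file="m.model")
print(sp.encode("Language models are unsupervised multitask learners", out_type=str))
```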