openai / gpt-2-output-dataset
Dataset of GPT-2 outputs for research in detection, biases, and more
☆1,967 · Updated last year
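As a rough illustration of working with the dataset, here is a minimal sketch that reads one downloaded JSONL split and previews a sample. The filename and the `text` field per record are assumptions about the data layout, not part of this listing; adjust them to whatever files you actually download.

```python
import json

# Hypothetical local path to one downloaded split of the output dataset;
# the filename and the "text" field are assumptions about the JSONL layout.
path = "webtext.test.jsonl"

with open(path, encoding="utf-8") as f:
    records = [json.loads(line) for line in f]

print(f"{len(records)} samples in {path}")
print(records[0].get("text", "")[:200])  # preview the first sample
```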
Alternatives and similar repositories for gpt-2-output-dataset:
Users interested in gpt-2-output-dataset are comparing it to the libraries listed below.
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,146 · Updated 2 years ago
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,401 · Updated 2 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆23,186 · Updated 7 months ago
- Simple Text-Generator with OpenAI gpt-2 Pytorch Implementation ☆987 · Updated 5 years ago
- Large-scale pretraining for dialogue ☆2,380 · Updated 2 years ago
- Conditional Transformer Language Model for Controllable Generation ☆1,878 · Updated 3 years ago
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators ☆2,350 · Updated 11 months ago
- Open clone of OpenAI's unreleased WebText dataset scraper. This version uses pushshift.io files instead of the API for speed. ☆729 · Updated 2 years ago
- Code and model for the paper "Improving Language Understanding by Generative Pre-Training" ☆2,192 · Updated 6 years ago
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,298 · Updated 2 weeks ago
- 🦄 State-of-the-Art Conversational AI with Transfer Learning ☆1,748 · Updated last year
- An open clone of the GPT-2 WebText dataset by OpenAI. Still WIP. ☆389 · Updated 11 months ago
- A robust Python tool for text-based AI training and generation using GPT-2. ☆1,844 · Updated last year
- Unsupervised text tokenizer for Neural Network-based text generation.