openai / gpt-2-output-dataset
Dataset of GPT-2 outputs for research in detection, biases, and more
☆1,943 · Updated 11 months ago
Alternatives and similar repositories for gpt-2-output-dataset:
Users interested in gpt-2-output-dataset are comparing it to the repositories listed below.
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,398 · Updated last year
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆22,582 · Updated 3 months ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,147 · Updated 2 years ago
- Simple text generator with OpenAI's GPT-2 (PyTorch implementation) ☆975 · Updated 5 years ago
- Code and model for the paper "Improving Language Understanding by Generative Pre-Training" ☆2,162 · Updated 5 years ago
- Open clone of OpenAI's unreleased WebText dataset scraper. This version uses pushshift.io files instead of the API for speed. ☆717 · Updated last year
- An implementation of training for GPT-2; supports TPUs ☆1,423 · Updated last year
- 🦄 State-of-the-Art Conversational AI with Transfer Learning ☆1,742 · Updated last year
- Conditional Transformer Language Model for Controllable Generation ☆1,873 · Updated 3 years ago
- jiant is an NLP toolkit ☆1,648 · Updated last year
- 🐥 A PyTorch implementation of OpenAI's finetuned transformer language model, with a script to import the weights pre-trained by OpenAI ☆1,512 · Updated 3 years ago
- An open clone of the GPT-2 WebText dataset by OpenAI. Still WIP. ☆385 · Updated 8 months ago
- An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library ☆8,237 · Updated 2 years ago
- A robust Python tool for text-based AI training and generation using GPT-2 ☆1,843 · Updated last year
- Code for "Defending Against Neural Fake News" (https://rowanzellers.com/grover/) ☆917 · Updated last year
- An implementation of model-parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆6,966 · Updated this week
- Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models. ☆1,132 · Updated 9 months ago
- Model parallel transformers in JAX and Haiku ☆6,306 · Updated last year
- Large-scale pretraining for dialogue ☆2,362 · Updated 2 years ago
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators ☆2,340 · Updated 8 months ago
- Crawl BookCorpus ☆813 · Updated last year
- XLNet: Generalized Autoregressive Pretraining for Language Understanding ☆6,182 · Updated last year
- Repo for external large-scale work ☆6,518 · Updated 7 months ago
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,197 · Updated 2 months ago
- Making text a first-class citizen in TensorFlow ☆1,233 · Updated last week
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations ☆3,248 · Updated last year
- Code for "Learning to summarize from human feedback" ☆994 · Updated last year