nshepperd / gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆1,146 · Updated 3 years ago
Alternatives and similar repositories for gpt-2
Users interested in gpt-2 are comparing it to the repositories listed below.
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts (see the usage sketch after this list) ☆3,406 · Updated 3 years ago
- A robust Python tool for text-based AI training and generation using GPT-2. ☆1,843 · Updated 2 years ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆2,008 · Updated 2 years ago
- Code for Defending Against Neural Fake News, https://rowanzellers.com/grover/ ☆919 · Updated 2 years ago
- Simple text generator built on an OpenAI GPT-2 PyTorch implementation ☆1,012 · Updated 6 years ago
- ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝 ☆375 · Updated 4 years ago
- 🦄 State-of-the-Art Conversational AI with Transfer Learning ☆1,755 · Updated 2 years ago
- Text-generation API via GPT-2 for Cloud Run ☆313 · Updated 4 years ago
- Conditional Transformer Language Model for Controllable Generation ☆1,885 · Updated 7 months ago
- OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0 ☆262 · Updated 2 years ago
- An open clone of the GPT-2 WebText dataset by OpenAI. Still WIP. ☆392 · Updated last year
- Open clone of OpenAI's unreleased WebText dataset scraper. This version uses pushshift.io files instead of the API for speed. ☆749 · Updated 3 years ago
- Method to encode text for GPT-2 to generate text based on provided keywords ☆262 · Updated 4 years ago
- Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models. ☆1,153 · Updated last year
- An implementation of training for GPT-2; supports TPUs ☆1,413 · Updated 3 years ago
- This repository contains the code for "Exploiting Cloze Questions for Few-Shot Text Classification and Natural Language Inference" ☆1,629 · Updated 2 years ago
- Large-scale pretraining for dialogue ☆2,416 · Updated 3 years ago
- Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code. ☆4,932 · Updated 3 years ago
- OpenAI's DALL-E for large-scale training in mesh-tensorflow. ☆430 · Updated 3 years ago
- Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Huggingface Transformers using DeepSpeed ☆435 · Updated 2 years ago
- Large datasets for conversational AI ☆1,371 · Updated 6 years ago
- Crawl BookCorpus ☆849 · Updated 2 years ago
- Test prompts for OpenAI's GPT-3 API and the resulting AI-generated texts. ☆700 · Updated 5 years ago
- Transformer language model (GPT-2) with sentencepiece tokenizer ☆164 · Updated 4 years ago
- ☆2,079 · Updated 3 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆108 · Updated 4 years ago
- Giant Language Model Test Room ☆492 · Updated last year
- ☆1,648 · Updated 2 years ago
- Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models. ☆540 · Updated last week
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators ☆2,368 · Updated last year
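
For readers weighing these options, here is a minimal sketch of the retraining workflow for gpt-2-simple (the first entry in the list), based on that project's documented usage; the corpus file name `shakespeare.txt` and the step count are illustrative assumptions, so check the repository's README for the current API.

```python
# Minimal sketch of fine-tuning GPT-2 with gpt-2-simple, assuming its
# documented download/finetune/generate API; "shakespeare.txt" is a
# hypothetical plain-text training corpus.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")  # fetch the 124M-parameter checkpoint

sess = gpt2.start_tf_sess()            # TensorFlow session wrapper
gpt2.finetune(sess,
              "shakespeare.txt",       # hypothetical training file
              model_name="124M",
              steps=1000)              # number of fine-tuning steps

gpt2.generate(sess)                    # sample text from the fine-tuned model
```

aitextgen (the second entry) is the same author's PyTorch-based successor and covers a similar fine-tune-and-generate workflow.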