minimaxir / aitextgen
A robust Python tool for text-based AI training and generation using GPT-2.
☆1,834 · Updated last year
Related projects:
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,402 · Updated last year
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,145 · Updated last year
- Text-generation API via GPT-2 for Cloud Run ☆313 · Updated 3 years ago
- Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code. ☆4,941 · Updated 2 years ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,930 · Updated 9 months ago
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,204 · Updated 2 years ago
- Test prompts for OpenAI's GPT-3 API and the resulting AI-generated texts. ☆702 · Updated 4 years ago
- Model parallel transformers in JAX and Haiku ☆6,273 · Updated last year
- Conditional Transformer Language Model for Controllable Generation ☆1,867 · Updated 2 years ago
- Open clone of OpenAI's unreleased WebText dataset scraper. This version uses pushshift.io files instead of the API for speed. ☆708 · Updated last year
- Python script to download public Tweets from a given Twitter account into a format suitable for AI text generation. ☆221 · Updated 4 years ago
- Large-scale pretraining for dialogue ☆2,343 · Updated last year
- 🦄 State-of-the-Art Conversational AI with Transfer Learning ☆1,734 · Updated last year
- Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models. ☆516 · Updated last month
- Guide: Finetune GPT2-XL (1.5 Billion Parameters) and finetune GPT-NEO (2.7 B) on a single GPU with Huggingface Transformers using DeepSpe… ☆428 · Updated last year
- Retrain GPT-2 in Colab ☆226 · Updated 5 years ago
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆6,819 · Updated this week
- ☆1,469 · Updated last year
- Tweet Generation with Huggingface ☆406 · Updated 8 months ago
- Simple Text-Generator with OpenAI GPT-2 PyTorch Implementation ☆963 · Updated 5 years ago
- ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝 ☆372 · Updated 3 years ago
- The goal of this project is to enable users to create cool web demos using the newly released OpenAI GPT-3 API with just a few lines of P… ☆2,898 · Updated 11 months ago
- Code for Defending Against Neural Fake News, https://rowanzellers.com/grover/ ☆914 · Updated last year
- A practical and feature-rich paraphrasing framework to augment human intents in text form to build robust NLU models for conversational e… ☆868 · Updated 8 months ago
- Large-scale pretrained models for goal-directed dialog ☆849 · Updated 9 months ago
- This repository contains the code for "Exploiting Cloze Questions for Few-Shot Text Classification and Natural Language Inference" ☆1,619 · Updated last year
- OpenAI's DALL-E for large-scale training in mesh-tensorflow. ☆434 · Updated 2 years ago
- Neural question generation using transformers ☆1,093 · Updated 5 months ago
- This Word Does Not Exist ☆1,020 · Updated 2 years ago
- Provide an input CSV and a target field to predict, generate a model + code to run it. ☆1,845 · Updated 4 years ago