minimaxir / aitextgen
A robust Python tool for text-based AI training and generation using GPT-2.
☆1,844 · Updated last year
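A minimal usage sketch of aitextgen is shown below. It assumes the package has been installed with `pip install aitextgen`; the method names (`aitextgen()`, `generate()`, `train()`) follow the project README, and exact parameters may differ between releases.

```python
# Minimal sketch, assuming `pip install aitextgen`; method names follow the
# project README and parameters may vary between releases.
from aitextgen import aitextgen

# Load the default 124M GPT-2 model (downloaded on first use).
ai = aitextgen()

# Generate a few samples from a prompt.
ai.generate(n=3, prompt="The meaning of life is", max_length=100)

# Fine-tune on a plain-text file, then sample from the tuned model.
ai.train("input.txt", num_steps=500, batch_size=1)
ai.generate(prompt="The meaning of life is")
```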
Alternatives and similar repositories for aitextgen:
Users interested in aitextgen are comparing it to the libraries listed below.
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,399 · Updated 2 years ago
- Text-generation API via GPT-2 for Cloud Run ☆316 · Updated 3 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,148 · Updated 2 years ago
- Test prompts for OpenAI's GPT-3 API and the resulting AI-generated texts. ☆701 · Updated 4 years ago
- ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝 ☆372 · Updated 3 years ago
- Model parallel transformers in JAX and Haiku ☆6,315 · Updated last year
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,260 · Updated 2 years ago
- Large-scale pretraining for dialogue ☆2,365 · Updated 2 years ago
- Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code. ☆4,937 · Updated 2 years ago
- Conditional Transformer Language Model for Controllable Generation ☆1,874 · Updated 3 years ago
- ☆2,048 · Updated 2 years ago
- 🦄 State-of-the-Art Conversational AI with Transfer Learning ☆1,744 · Updated last year
- Python script to download public Tweets from a given Twitter account into a format suitable for AI text generation. ☆223 · Updated 4 years ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,952 · Updated last year
- Open clone of OpenAI's unreleased WebText dataset scraper. This version uses pushshift.io files instead of the API for speed. ☆724 · Updated 2 years ago
- Tweet Generation with Huggingface ☆415 · Updated last year
- Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models. ☆1,135 · Updated 10 months ago
- Method to encode text for GPT-2 to generate text based on provided keywords ☆260 · Updated 3 years ago
- OpenAI's DALL-E for large-scale training in mesh-tensorflow. ☆433 · Updated 2 years ago
- Code for Defending Against Neural Fake News, https://rowanzellers.com/grover/ ☆918 · Updated last year
- Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models. ☆528 · Updated 5 months ago
- The goal of this project is to enable users to create cool web demos using the newly released OpenAI GPT-3 API with just a few lines of Python. ☆2,897 · Updated last year
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,042 · Updated this week
- Generate images from texts. In Russian ☆1,644 · Updated 2 years ago
- Guide: Finetune GPT2-XL (1.5 Billion Parameters) and finetune GPT-NEO (2.7B) on a single GPU with Huggingface Transformers using DeepSpeed ☆436 · Updated last year
- Multi-angle c(q)uestion answering ☆458 · Updated 2 years ago
- ☆1,523 · Updated last year
- A practical and feature-rich paraphrasing framework to augment human intents in text form to build robust NLU models for conversational e… ☆882 · Updated last year