minimaxir / aitextgen
A robust Python tool for text-based AI training and generation using GPT-2.
☆1,841 · Updated last year
Alternatives and similar repositories for aitextgen:
Users interested in aitextgen are comparing it to the libraries listed below:
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,399 · Updated 2 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,146 · Updated 2 years ago
- Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code. ☆4,939 · Updated 2 years ago
- Text-generation API via GPT-2 for Cloud Run ☆315 · Updated 3 years ago
- Method to encode text for GPT-2 to generate text based on provided keywords ☆260 · Updated 4 years ago
- 🦄 State-of-the-Art Conversational AI with Transfer Learning ☆1,749 · Updated last year
- Conditional Transformer Language Model for Controllable Generation ☆1,878 · Updated 3 years ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,967 · Updated last year
- Code for Defending Against Neural Fake News, https://rowanzellers.com/grover/ ☆921 · Updated last year
- ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝 ☆371 · Updated 3 years ago
- Python script to download public Tweets from a given Twitter account into a format suitable for AI text generation. ☆225 · Updated 4 years ago
- Test prompts for OpenAI's GPT-3 API and the resulting AI-generated texts. ☆702 · Updated 4 years ago
- Generate images from texts. In Russian. ☆1,647 · Updated 2 years ago
- Model parallel transformers in JAX and Haiku ☆6,321 · Updated 2 years ago
- Open clone of OpenAI's unreleased WebText dataset scraper. This version uses pushshift.io files instead of the API for speed. ☆730 · Updated 2 years ago
- Tweet generation with Hugging Face ☆422 · Updated last year
- Guide: Finetune GPT2-XL (1.5 billion parameters) and finetune GPT-NEO (2.7B) on a single GPU with Hugging Face Transformers using DeepSpe… ☆437 · Updated last year
- Repo for external large-scale work ☆6,519 · Updated 10 months ago
- Large-scale pretraining for dialogue ☆2,381 · Updated 2 years ago
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,287 · Updated 3 years ago
- An open clone of the GPT-2 WebText dataset by OpenAI. Still WIP. ☆389 · Updated 11 months ago
- Library to scrape and clean web pages to create massive datasets. ☆2,181 · Updated 4 years ago
- An implementation of training for GPT-2, supports TPUs ☆1,425 · Updated 2 years ago
- Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models. ☆529 · Updated 7 months ago
- A bot that generates realistic replies using a combination of pretrained GPT-2 and BERT models ☆193 · Updated 4 years ago
- Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models. ☆1,141 · Updated last year
- Your new Telegram buddy powered by transformers ☆434 · Updated last year
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators ☆2,350 · Updated last year