shawwn / gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆109 · Updated 3 years ago
Alternatives and similar repositories for gpt-2:
Users interested in gpt-2 are comparing it to the repositories listed below.
- Set up the CTRL text-generating model on Google Compute Engine with just a few console commands. ☆151 · Updated 5 years ago
- Text-generation API via GPT-2 for Cloud Run ☆316 · Updated 3 years ago
- Interactive explorer for language models ☆132 · Updated 2 years ago
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts (see the fine-tuning sketch after this list) ☆18 · Updated 3 years ago
- Train GPT-2 in Colab ☆13 · Updated 5 years ago
- A Reddit bot based on OpenAI's GPT-2 117M model ☆102 · Updated 5 years ago
- Weird A.I. Yankovic, a neural-net-based lyrics parody generator ☆84 · Updated 2 years ago
- A client for OpenAI's GPT-3 API for ad hoc testing of prompts without using the web interface (see the API sketch after this list) ☆90 · Updated 4 years ago
- Generating poetry with GPT-2 ☆89 · Updated 5 years ago
- Training neural networks to communicate with a visual language ☆75 · Updated last year
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆20 · Updated 5 years ago
- Poetry generator built on GPT-2 with meter and rhyme constraints ☆53 · Updated 3 years ago
- Lazy, a tool for running things in idle time ☆48 · Updated 3 years ago
- A (relatively) minimal configuration app to run Twitter bots on a schedule that can scale to unlimited bots ☆77 · Updated 3 years ago
- Babysit your preemptible TPUs ☆85 · Updated 2 years ago
- Method to encode text for GPT-2 to generate text based on provided keywords ☆260 · Updated 3 years ago
- ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝 ☆372 · Updated 3 years ago
- Fine-tune GPT-2 on your favourite authors ☆71 · Updated 10 months ago
- Test prompts for GPT-J-6B and the resulting AI-generated texts ☆53 · Updated 3 years ago
- Transformer language model (GPT-2) with sentencepiece tokenizer ☆164 · Updated 3 years ago
- ☆58 · Updated 5 years ago
- ☆28 · Updated last year
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆104 · Updated 3 years ago
- Dump of generated texts from GPT-2 trained on Hacker News titles ☆117 · Updated 5 years ago
- A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+ ☆37 · Updated 3 years ago
- Hidden Engrams: Long-Term Memory for Transformer Model Inference ☆34 · Updated 3 years ago
- Fine-tuning experiments for the GPT-2 model by OpenAI ☆20 · Updated 5 years ago
- Neural Painters ☆78 · Updated 4 years ago
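
For the GPT-2 retraining package listed above, a minimal fine-tuning sketch, assuming it exposes the same interface as minimaxir's gpt-2-simple (of which the listed repository appears to be a fork); `corpus.txt` is a placeholder file name:

```python
import gpt_2_simple as gpt2

# Download the 124M checkpoint into ./models/124M (one-time).
gpt2.download_gpt2(model_name="124M")

# Fine-tune on a plain-text corpus; checkpoints land in ./checkpoint/run1.
sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="corpus.txt",  # placeholder training file
              model_name="124M",
              steps=500)

# Sample from the fine-tuned model.
gpt2.generate(sess, prefix="Once upon a time", length=100)
```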
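The GPT-3 client listed above wraps OpenAI's completions endpoint; the sketch below shows the same kind of ad hoc prompt testing using OpenAI's own Python library as it existed in the GPT-3 era (pre-1.0 `openai` package), not the listed repo's specific interface. The API key and prompt are placeholders:

```python
import openai

openai.api_key = "sk-..."  # placeholder; set your own API key

# Send an ad hoc prompt to the davinci completion engine.
response = openai.Completion.create(
    engine="davinci",
    prompt="Q: Who wrote The Odyssey?\nA:",
    max_tokens=16,
    temperature=0.7,
    stop=["\n"],
)
print(response.choices[0].text.strip())
```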