ADGEfficiency / creative-writing-with-gpt2
Fine tune GPT-2 with your favourite authors
☆72 · Updated last year
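The repository fine-tunes GPT-2 on text written by authors you choose. Below is a minimal sketch of that kind of fine-tuning using the Hugging Face transformers library; it is not the repository's own code, and `corpus.txt` is a hypothetical plain-text file of author writing.

```python
# Minimal sketch: fine-tune GPT-2 on a plain-text corpus with Hugging Face
# transformers. Not the code of creative-writing-with-gpt2, just one common
# way to do the same thing. "corpus.txt" is a hypothetical input file.
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Chunk the raw text into fixed-length blocks of token ids.
dataset = TextDataset(tokenizer=tokenizer, file_path="corpus.txt", block_size=128)
# Causal language modelling: mlm=False means labels are the (shifted) inputs.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=2,
    save_steps=500,
)

Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=dataset,
).train()
```

After training, calling `model.generate` with sampling (for example `do_sample=True`, `top_k=50`, `top_p=0.95`) produces text in the style of the fine-tuning corpus.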
Alternatives and similar repositories for creative-writing-with-gpt2
Users interested in creative-writing-with-gpt2 are comparing it to the libraries listed below.
- A bot that generates realistic replies using a combination of pretrained GPT-2 and BERT models ☆193 · Updated 4 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆108 · Updated 4 years ago
- A Reddit bot based on OpenAI's GPT-2 117M model ☆101 · Updated 6 years ago
- Google's Meena transformer chatbot implementation ☆105 · Updated 3 years ago
- Method to encode text for GPT-2 to generate text based on provided keywords ☆260 · Updated 4 years ago
- ☆50 · Updated 2 years ago
- OpenAI GPT-2 Flask API ☆52 · Updated 6 years ago
- ☆27 · Updated 2 years ago
- Simple Python client for the Hugging Face Inference API ☆75 · Updated 5 years ago
- Hybrid conversational bot based on both neural retrieval and neural generative mechanisms, with TTS ☆83 · Updated 2 years ago
- Generating paper titles (and more!) with GPT trained on data scraped from arXiv ☆149 · Updated 2 years ago
- Many natural language processing tasks rely on sentence boundary detection (SBD). Although amazing libraries like spaCy provide state of … ☆61 · Updated 5 years ago
- Simple annotated implementation of GPT-NeoX in PyTorch ☆110 · Updated 3 years ago
- Reproducing the "Writing with Transformer" demo, using aitextgen/FastAPI in the backend and Quill/React in the frontend ☆28 · Updated 4 years ago
- Quotes-generating bot ☆35 · Updated 2 years ago
- A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+ ☆37 · Updated 4 years ago
- GPT-2 user interface based on Hugging Face's PyTorch implementation ☆56 · Updated last year
- A simple approach to using GPT2-medium (345M) for generating high-quality text summaries with minimal training (see the sketch after this list) ☆156 · Updated 2 years ago
- Agents that build knowledge graphs and explore textual worlds by asking questions ☆79 · Updated 2 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆107 · Updated 4 years ago
- Fine-tuning GPT-2 Small for question answering ☆130 · Updated 2 years ago
- Paraphrase generation model using a pair-wise discriminator loss ☆46 · Updated 4 years ago
- Paraphrase any question with T5 (Text-To-Text Transfer Transformer) - pretrained model and training script provided ☆185 · Updated 2 years ago
- Reading comprehension with the ALBERT transformer model ☆15 · Updated 3 years ago
- Code for obtaining the Curation Corpus abstractive text summarisation dataset ☆127 · Updated 4 years ago
- Python script to download public tweets from a given Twitter account into a format suitable for AI text generation ☆226 · Updated 5 years ago
- 📄 Neural sentential paraphrase generation to augment chatbot training datasets ☆21 · Updated 2 years ago
- ✍🏻 gpt2-client: Easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B transformer models 🤖 📝 ☆372 · Updated 4 years ago
- Datasets I have created for scientific summarization, and a trained BertSum model ☆115 · Updated 5 years ago
- Using BERT for conditional natural language generation by fine-tuning pre-trained BERT on a custom dataset ☆41 · Updated 5 years ago
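One entry above describes generating summaries with GPT2-medium and minimal training. An approach in that spirit, taken from the GPT-2 paper, is "TL;DR:" prompting: append "TL;DR:" to the source document and sample a continuation. The sketch below illustrates that general trick with the Hugging Face transformers library; it is not the code of any repository listed here, and the article string is a placeholder.

```python
# Minimal sketch of "TL;DR:" prompting for summarization with GPT-2.
# An illustration of the general approach, not a listed repository's code.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2-medium")
model = GPT2LMHeadModel.from_pretrained("gpt2-medium")

article = "Long source article text goes here ..."  # placeholder input document
prompt = article.strip() + "\nTL;DR:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=60,          # length of the generated summary
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
# Keep only the tokens generated after the prompt.
summary = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(summary)
```

Without any fine-tuning the summaries are rough; fine-tuning on article/summary pairs formatted with the same "TL;DR:" separator is the usual way to improve them.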