polakowo / gpt2bot
Your new Telegram buddy powered by transformers
☆429 · Updated 8 months ago
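gpt2bot wires a Telegram frontend to a DialoGPT-style model served through Hugging Face transformers. As a rough illustration of the reply-generation step such a bot relies on, here is a minimal sketch; the model name, history handling, and decoding parameters are illustrative assumptions rather than gpt2bot's exact configuration:

```python
# Minimal sketch of DialoGPT-style reply generation with Hugging Face transformers.
# Model name, history handling, and decoding settings are assumptions for illustration;
# gpt2bot's actual configuration may differ.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

def generate_reply(history_ids, user_message):
    # Append the new user turn, terminated by the end-of-sequence token.
    new_ids = tokenizer.encode(user_message + tokenizer.eos_token, return_tensors="pt")
    input_ids = new_ids if history_ids is None else torch.cat([history_ids, new_ids], dim=-1)

    # Sample a continuation; everything after the current input is the bot's reply.
    output_ids = model.generate(
        input_ids,
        max_length=1000,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,
    )
    reply = tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True)
    return output_ids, reply

history = None
history, answer = generate_reply(history, "Hi there, how are you?")
print(answer)
```

In a Telegram bot, this function would be called from the message handler, with the returned `output_ids` kept per chat so the model sees the running conversation history.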
Related projects
Alternatives and complementary repositories for gpt2bot
- GPT-2 Telegram chat bot ☆69 · Updated last year
- Hybrid conversational bot based on both neural retrieval and neural generative mechanisms, with TTS. ☆82 · Updated last year
- Generating responses with pretrained XLNet and GPT-2 in PyTorch. ☆118 · Updated 2 years ago
- EMNLP 2020: "Dialogue Response Ranking Training with Large-Scale Human Feedback Data" ☆336 · Updated last week
- A simple approach to using GPT2-medium (345M) for generating high-quality text summaries with minimal training. ☆157 · Updated last year
- OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0 ☆260 · Updated last year
- A search engine for ParlAI's BlenderBot project (and probably other ones as well) ☆132 · Updated 2 years ago
- ☆33 · Updated 3 years ago
- Fine-tuning GPT-2 Small for question answering ☆130 · Updated last year
- Method to encode text for GPT-2 to generate text based on provided keywords ☆259 · Updated 3 years ago
- Script to interact with the DialoGPT models. ☆52 · Updated 5 years ago
- Google's Meena transformer chatbot implementation ☆105 · Updated 3 years ago
- Text-generation API via GPT-2 for Cloud Run ☆315 · Updated 3 years ago
- Paraphrase any question with T5 (Text-To-Text Transfer Transformer) - pretrained model and training script provided ☆189 · Updated last year
- Large-scale pretraining for dialogue ☆2,361 · Updated 2 years ago
- A GPT-J API to use with Python 3 to generate text, blogs, code, and more ☆204 · Updated 2 years ago
- A simple bot that allows you to chat with various personas ☆71 · Updated 4 years ago
- A bot that generates realistic replies using a combination of pretrained GPT-2 and BERT models ☆190 · Updated 3 years ago
- ✍🏻 gpt2-client: Easy-to-use TensorFlow wrapper for GPT-2 117M, 345M, 774M, and 1.5B transformer models 🤖 📝 ☆372 · Updated 3 years ago
- Large datasets for conversational AI ☆1,294 · Updated 5 years ago
- Fine-tuned pre-trained GPT-2 for custom topic-specific text generation. Such a system can be used for text augmentation. ☆188 · Updated last year
- ☆32 · Updated 2 years ago
- Fine-tune GPT-2 with your favourite authors ☆71 · Updated 8 months ago
- An open clone of the GPT-2 WebText dataset by OpenAI. Still WIP. ☆385 · Updated 7 months ago
- API for the GPT-J language model 🦜. Includes a FastAPI backend and a Streamlit frontend ☆338 · Updated 3 years ago
- Replika.ai Research Papers, Posters, Slides & Datasets ☆373 · Updated 2 years ago
- Paraphrase generation at the sentence level ☆407 · Updated last year
- Guide: Finetune GPT2-XL (1.5 billion parameters) and finetune GPT-NEO (2.7B) on a single GPU with Huggingface Transformers using DeepSpe… ☆432 · Updated last year