ConnorJL / GPT2
An implementation of GPT-2 training, with TPU support.
☆ 1,418 · updated last year
Related projects:
- Conditional Transformer Language Model for Controllable Generation (☆ 1,867, updated 2 years ago)
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts (☆ 3,402, updated last year)
- Toolkit for machine learning, natural language processing, and text generation in TensorFlow; part of the CASL project: http://… (☆ 2,387, updated 3 years ago)
- Dataset of GPT-2 outputs for research in detection, biases, and more (☆ 1,930, updated 9 months ago)
- Simple text generator built on a PyTorch implementation of OpenAI's GPT-2 (☆ 963, updated 5 years ago)
- Code and model for the paper "Improving Language Understanding by Generative Pre-Training" (☆ 2,139, updated 5 years ago)
- XLNet: Generalized Autoregressive Pretraining for Language Understanding (☆ 6,175, updated last year)
- Original PyTorch implementation of Cross-lingual Language Model Pretraining (☆ 2,872, updated last year)
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (☆ 3,233, updated last year)
- Single Headed Attention RNN - "Stop thinking with your head" (☆ 1,177, updated 2 years ago)
- The Natural Language Decathlon: A Multitask Challenge for NLP (☆ 2,343, updated 8 months ago)
- NLP DNN toolkit - build your NLP DNN models like playing with Lego (☆ 1,447, updated last year)
- ✍🏻 gpt2-client: an easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B transformer models 🤖 📝 (☆ 372, updated 3 years ago)
- Code for the paper "Language Models are Unsupervised Multitask Learners" (☆ 1,145, updated last year)
- 🐥 A PyTorch implementation of OpenAI's finetuned transformer language model, with a script to import the weights pre-trained by OpenAI (☆ 1,506, updated 3 years ago)
- NLP made easy (☆ 2,554, updated 11 months ago)
- Dataset code that generates mathematical question-and-answer pairs from a range of question types at roughly school-level difficulty (☆ 1,777, updated last month)
- Phrase-Based & Neural Unsupervised Machine Translation (☆ 1,507, updated 3 years ago)
- Implementation of BERT that can load the official pre-trained models for feature extraction and prediction (☆ 2,429, updated 2 years ago)
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (☆ 2,321, updated 5 months ago)
- Lingvo (☆ 2,810, updated last week)
- MASS: Masked Sequence to Sequence Pre-training for Language Generation (☆ 1,115, updated last year)
- Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab (☆ 3,016, updated 7 months ago)
- 🦄 State-of-the-art conversational AI with transfer learning (☆ 1,734, updated last year)
- Neural machine translation and sequence learning using TensorFlow (☆ 1,451, updated 11 months ago)
- DELTA, a deep-learning-based natural language and speech processing platform (☆ 1,590, updated 5 months ago)
- Language-Agnostic SEntence Representations (☆ 3,576, updated 4 months ago)
- Plug and Play Language Model implementation; allows steering the topic and attributes of GPT-2 models (☆ 1,128, updated 6 months ago)
- A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU (☆ 1,466, updated last year)