itsuncheng / fine-tuning-GPT2
Codebase for the Medium article on fine-tuning GPT-2 for text generation
☆69 · Updated 4 years ago
Related projects
Alternatives and complementary repositories for fine-tuning-GPT2
- Paraphrase any question with T5 (Text-To-Text Transfer Transformer); pretrained model and training script provided ☆189 · Updated last year
- Tutorial for first-time BERT users ☆102 · Updated last year
- The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021) ☆52 · Updated 2 years ago
- SUPERT: Unsupervised multi-document summarization evaluation & generation ☆91 · Updated last year
- Resources for the paper "CTRLsum: Towards Generic Controllable Text Summarization" ☆146 · Updated last year
- A repo exploring different NLP tasks that can be solved with T5 ☆169 · Updated 3 years ago
- A Natural Language Inference (NLI) model based on Transformers (BERT and ALBERT) ☆129 · Updated 9 months ago
- Code from the paper "What do Models Learn from Question Answering Datasets?" (EMNLP 2020) ☆55 · Updated 3 years ago
- PyTorch implementation of Recurrence over BERT (RoBERT) based on this paper https://arxiv.org/abs/1910.10781 and comparison with pyTorch … ☆79 · Updated 2 years ago
- Fine-tuning GPT-2 Small for question answering ☆130 · Updated last year
- MoverScore: Text Generation Evaluating with Contextualized Embeddings and Earth Mover Distance ☆199 · Updated last year
- Coreference resolution with different higher-order inference methods, implemented in PyTorch ☆35 · Updated last year
- A repository for our AAAI-2020 cross-lingual NER paper. Code will be updated shortly.