p208p2002 / Transformer-QG-on-SQuAD
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
☆48 · Updated 2 years ago
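For orientation, below is a minimal sketch of how an answer-aware question generator of this kind is typically driven through the Hugging Face Transformers library. The checkpoint name and the `[HL]` answer-highlight convention are assumptions for illustration, not details taken from this listing.

```python
# Minimal sketch of answer-aware question generation with a seq2seq LM.
# The checkpoint name and the [HL] highlight convention are assumptions,
# not confirmed details of the Transformer-QG-on-SQuAD repository.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "p208p2002/t5-squad-qg-hl"  # assumed/hypothetical checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# Mark the answer span with [HL] tokens so the model knows what to ask about.
context = (
    "Harry Potter is a series of seven fantasy novels written by "
    "[HL]J. K. Rowling[HL]."
)

inputs = tokenizer(context, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```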
Alternatives and similar repositories for Transformer-QG-on-SQuAD
Users interested in Transformer-QG-on-SQuAD are comparing it to the repositories listed below.
- Code for "A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies."☆27Updated 3 years ago
- Data and code for paper "EQG-RACE: Examination-Type Question Generation" at AAAI2021.☆27Updated 3 years ago
- ☆21Updated 2 years ago
- ☆14Updated 3 years ago
- A dataset of over 10000 question and answer pairs written for storybooks.☆40Updated 2 years ago
- ☆33Updated last year
- Code for EMNLP2020 paper: "Tell Me How to Ask Again: Question Data Augmentation with Controllable Rewriting in Continuous Space"☆26Updated 4 years ago
- Dataset for NAACL 2021 paper: "QMSum: A New Benchmark for Query-based Multi-domain Meeting Summarization"☆124Updated last year
- Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021)☆30Updated 2 years ago