grammatical / pretraining-bea2019

Models, system configurations, and outputs of our winning GEC systems in the BEA 2019 shared task, as described in R. Grundkiewicz, M. Junczys-Dowmunt, and K. Heafield: "Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data", BEA 2019.
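
The pre-training stage in the paper relies on synthetic errorful data generated from clean monolingual text. As a rough illustration only (not the exact noising procedure used in these systems), the sketch below shows a generic way to corrupt clean sentences with random token drops, swaps, and duplications to produce (noisy, clean) training pairs; all function names and probabilities are hypothetical.

```python
import random

def corrupt(tokens, p_drop=0.1, p_swap=0.1, p_dup=0.05, seed=None):
    """Return a noised copy of a token list for synthetic GEC pre-training data.

    Generic sketch: drops, swaps, or duplicates tokens at random.
    """
    rng = random.Random(seed)
    out = []
    i = 0
    while i < len(tokens):
        r = rng.random()
        if r < p_drop:
            i += 1                                   # delete the token
            continue
        if r < p_drop + p_swap and i + 1 < len(tokens):
            out.extend([tokens[i + 1], tokens[i]])   # swap adjacent tokens
            i += 2
            continue
        out.append(tokens[i])
        if rng.random() < p_dup:
            out.append(tokens[i])                    # duplicate the token
        i += 1
    return out

if __name__ == "__main__":
    clean = "the quick brown fox jumps over the lazy dog".split()
    noisy = corrupt(clean, seed=1)
    print("source (noisy):", " ".join(noisy))
    print("target (clean):", " ".join(clean))
```

The noisy sentence serves as the source side and the original clean sentence as the target, giving parallel data for pre-training before fine-tuning on annotated GEC corpora.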
