Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction

Masahiro Kaneko, Masato Mita, Shun Kiyono, Jun Suzuki, Kentaro Inui

Abstract

This paper investigates how to effectively incorporate a pre-trained masked language model (MLM), such as BERT, into an encoder-decoder (EncDec) model for grammatical error correction (GEC). The answer to this question is not as straightforward as one might expect, because the common previous methods for incorporating an MLM into an EncDec model have potential drawbacks when applied to GEC. For example, the distribution of the inputs to a GEC model can differ considerably (erroneous, clumsy, etc.) from that of the corpora used for pre-training MLMs; this issue is not addressed by the previous methods. Our experiments show that our proposed method, in which we first fine-tune an MLM on a given GEC corpus and then use the output of the fine-tuned MLM as additional features in the GEC model, maximizes the benefit of the MLM. The best-performing model achieves state-of-the-art performance on the BEA-2019 and CoNLL-2014 benchmarks. Our code is publicly available at: https://github.com/kanekomasahiro/bert-gec.
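The core idea in the abstract, using a fine-tuned MLM's hidden states as additional input features for the EncDec model, can be sketched as below. This is an illustrative reconstruction, not the authors' actual code: the function name, dimensions, and the simple concatenate-and-project fusion are assumptions; in practice the MLM features would come from a BERT model fine-tuned on the GEC corpus.

```python
import numpy as np

def fuse_mlm_features(token_embeds, mlm_hidden, W, b):
    """Concatenate the encoder's own token embeddings with MLM hidden
    states, then project back to the encoder dimension so the fused
    vectors can replace the ordinary input embeddings (sketch only)."""
    fused = np.concatenate([token_embeds, mlm_hidden], axis=-1)  # (seq, d_model + d_mlm)
    return fused @ W + b                                         # (seq, d_model)

# Toy dimensions: 512-d encoder, 768-d MLM (BERT-base-sized), 7 tokens.
rng = np.random.default_rng(0)
d_model, d_mlm, seq_len = 512, 768, 7
token_embeds = rng.standard_normal((seq_len, d_model))
mlm_hidden = rng.standard_normal((seq_len, d_mlm))   # stand-in for fine-tuned BERT output
W = rng.standard_normal((d_model + d_mlm, d_model)) * 0.01
b = np.zeros(d_model)

out = fuse_mlm_features(token_embeds, mlm_hidden, W, b)
print(out.shape)  # (7, 512)
```

Fusing at the feature level, rather than initializing the EncDec weights from the MLM, is what lets the GEC model keep its own parameters while still conditioning on the fine-tuned MLM's view of the (possibly erroneous) input.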

Benchmarks

Benchmark                                        Methodology                                         Metric
grammatical-error-correction-on-bea-2019-test    Transformer + Pre-train with Pseudo Data (+BERT)    F0.5: 69.8
grammatical-error-correction-on-conll-2014       Transformer + Pre-train with Pseudo Data (+BERT)    F0.5: 65.2
grammatical-error-correction-on-jfleg            Transformer + Pre-train with Pseudo Data + BERT     GLEU: 62.0
