Stronger Baselines for Grammatical Error Correction Using Pretrained Encoder-Decoder Model

Satoru Katsumata, Mamoru Komachi

Abstract

Studies on grammatical error correction (GEC) have reported the effectiveness of pretraining a Seq2Seq model with a large amount of pseudodata. However, this approach requires time-consuming pretraining for GEC because of the size of the pseudodata. In this study, we explore the utility of bidirectional and auto-regressive transformers (BART) as a generic pretrained encoder-decoder model for GEC. By using this generic pretrained model for GEC, the time-consuming pretraining can be eliminated. We find that monolingual and multilingual BART models achieve high performance in GEC, with one of the results being comparable to the current strong results in English GEC. Our implementations are publicly available on GitHub (https://github.com/Katsumata420/generic-pretrained-GEC).
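
In practice, the approach described in the abstract amounts to fine-tuning an off-the-shelf BART checkpoint on GEC data and then decoding corrections as ordinary sequence-to-sequence generation. The sketch below shows the inference pattern using the Hugging Face Transformers API; the checkpoint name is a placeholder for a BART model already fine-tuned for GEC (the stock "facebook/bart-large" checkpoint is not error-correcting on its own, and the authors' implementation in the linked repository may use different tooling).

# Minimal GEC inference sketch with a fine-tuned BART encoder-decoder.
# "path/to/gec-bart" is a hypothetical placeholder checkpoint, not a released model.
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "path/to/gec-bart"  # assumed: BART already fine-tuned on GEC data
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Encode an erroneous source sentence.
source = "She go to school every days ."
inputs = tokenizer(source, return_tensors="pt")

# Decode the correction with beam search, as in standard Seq2Seq generation.
output_ids = model.generate(**inputs, num_beams=5, max_length=128)
corrected = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(corrected)  # e.g. "She goes to school every day."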

Code Repositories

https://github.com/Katsumata420/generic-pretrained-GEC

Benchmarks

Benchmark: grammatical-error-correction-on-conll-2014
Methodology: BART
Metrics: F0.5: 63.0 | Precision: 69.9 | Recall: 45.1
