
A Recurrent BERT-based Model for Question Generation

Ying-Hong Chan, Yao-Chung Fan

Abstract

In this study, we investigate the use of the pre-trained BERT language model for question generation tasks. We introduce three neural architectures built on top of BERT. The first is a straightforward application of BERT, which reveals the defects of using BERT directly for text generation. Accordingly, we propose two further models that restructure our BERT employment into a sequential manner, so that each decoding step can take information from previously decoded results. Our models are trained and evaluated on the recent question-answering dataset SQuAD. Experimental results show that our best model yields state-of-the-art performance, advancing the BLEU-4 score of the existing best models from 16.85 to 22.17.
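
To make the sequential decoding idea concrete, below is a minimal sketch of decoding with BERT's masked-LM head in the spirit described by the abstract: at each step the paragraph plus all previously decoded tokens are fed to BERT with a trailing [MASK], and the predicted token is appended before the next step. The HuggingFace Transformers tooling, the vanilla bert-base-uncased checkpoint, and the greedy loop are illustrative assumptions, not the authors' fine-tuned model or exact code.

```python
# Sketch of recurrent (one-token-per-step) decoding with BERT's masked-LM head.
# Assumptions: HuggingFace Transformers, an un-fine-tuned bert-base-uncased
# checkpoint (output quality will be poor without task-specific training).
import torch
from transformers import BertTokenizerFast, BertForMaskedLM

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased").eval()

def generate_question(paragraph: str, max_steps: int = 30) -> str:
    """Greedily decode: append [MASK], predict it with BERT, repeat."""
    context_ids = tokenizer.encode(paragraph, add_special_tokens=False)
    decoded = []
    for _ in range(max_steps):
        # Input at each step: [CLS] context [SEP] decoded-so-far [MASK] [SEP]
        # (segment ids are omitted here for brevity)
        ids = ([tokenizer.cls_token_id] + context_ids + [tokenizer.sep_token_id]
               + decoded + [tokenizer.mask_token_id, tokenizer.sep_token_id])
        with torch.no_grad():
            logits = model(torch.tensor([ids])).logits
        mask_pos = len(ids) - 2          # position of the trailing [MASK]
        next_id = int(logits[0, mask_pos].argmax())
        if next_id == tokenizer.sep_token_id:  # treat [SEP] as end-of-question
            break
        decoded.append(next_id)
    return tokenizer.decode(decoded)

print(generate_question("BERT is a pre-trained language model released in 2018."))
```

Feeding the growing prefix back into the encoder at every step is what distinguishes this sequential scheme from the paper's first, one-shot BERT baseline, which has no access to previously decoded results.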

Benchmarks

Benchmark                           Methodology   Metrics
question-generation-on-squad1.1     BERT-SQG      BLEU-4: 22.17
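
For reference, BLEU-4 is the geometric mean of 1- to 4-gram precisions against a reference, scaled by a brevity penalty. A minimal sketch of computing it with NLTK follows; this tooling choice and the toy sentences are assumptions, since the page does not specify the paper's evaluation script.

```python
# Illustrative sentence-level BLEU-4 with NLTK (not the paper's exact script).
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = ["what year was bert released".split()]  # list of reference token lists
candidate = "when was bert released".split()          # hypothesis tokens
smooth = SmoothingFunction().method1                  # avoids zero n-gram counts
score = sentence_bleu(reference, candidate,
                      weights=(0.25, 0.25, 0.25, 0.25),  # equal 1- to 4-gram weights
                      smoothing_function=smooth)
print(f"BLEU-4: {score:.4f}")
```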
