HyperAI
A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies

Ho-Lam Chung; Ying-Hong Chan; Yao-Chung Fan

Abstract

In this paper, we investigate two limitations of existing distractor generation (DG) methods. First, the quality of existing DG methods is still far from practical use; there remains considerable room for improvement. Second, existing DG designs mainly target single-distractor generation, whereas practical multiple-choice question (MCQ) preparation requires multiple distractors. To address these issues, we present a new distractor generation scheme with multi-tasking and negative answer training strategies for effectively generating *multiple* distractors. The experimental results show that (1) our model advances the state-of-the-art BLEU-1 score from 28.65 to 39.81, and (2) the generated distractors are diverse and show strong distracting power for multiple-choice questions.
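The abstract names a "negative answer training" strategy for steering generation away from the correct answer. The paper's exact loss is not given here, so the following is only a toy sketch of one plausible reading: a standard negative log-likelihood term for the distractor tokens combined with an unlikelihood-style penalty on the answer tokens. The function names, the single shared token distribution, and the weighting term `lam` are all illustrative assumptions, not the authors' implementation.

```python
import math

def nll(probs, token_ids):
    """Negative log-likelihood of a token sequence under a
    (toy) shared probability distribution over the vocabulary."""
    return -sum(math.log(probs[t]) for t in token_ids)

def negative_answer_loss(probs, distractor_ids, answer_ids, lam=0.5):
    """Toy combined objective: minimize NLL of the distractor tokens
    while penalizing probability mass placed on the answer tokens
    via an unlikelihood term -log(1 - p(answer token))."""
    pos = nll(probs, distractor_ids)
    neg = -sum(math.log(1.0 - probs[t]) for t in answer_ids)
    return pos + lam * neg
```

In a real seq2seq model the distribution would be per decoding step and the loss averaged over positions; the sketch only shows how the two terms trade off against each other.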

Code Repositories

voidful/BDG (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
distractor-generation-on-race | BDG p.m. | BLEU-1: 39.81, BLEU-2: 24.81, BLEU-3: 17.66, BLEU-4: 13.56, ROUGE-L: 34.01
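The headline metric above is BLEU-1, i.e., modified unigram precision with a brevity penalty. As a minimal sketch (single reference, unigrams only, no smoothing; the function name is illustrative), it can be computed as:

```python
import math
from collections import Counter

def bleu1(candidate: str, reference: str) -> float:
    """BLEU-1: clipped unigram precision times the brevity penalty,
    for a single candidate/reference pair."""
    cand = candidate.split()
    ref = reference.split()
    # Clipped overlap: each candidate unigram counts at most as often
    # as it appears in the reference.
    overlap = sum((Counter(cand) & Counter(ref)).values())
    precision = overlap / len(cand)
    # Brevity penalty punishes candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * precision
```

Reported benchmark numbers are typically corpus-level and multi-reference, so this per-sentence sketch only illustrates what the metric measures.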
