Giving BERT a Calculator: Finding Operations and Arguments with Reading Comprehension
Daniel Andor; Luheng He; Kenton Lee; Emily Pitler

Abstract
Reading comprehension models have been successfully applied to extractive text answers, but it is unclear how best to generalize these models to abstractive numerical answers. We enable a BERT-based reading comprehension model to perform lightweight numerical reasoning. We augment the model with a predefined set of executable 'programs' which encompass simple arithmetic as well as extraction. Rather than having to learn to manipulate numbers directly, the model can pick a program and execute it. On the recent Discrete Reasoning Over Passages (DROP) dataset, designed to challenge reading comprehension models, we show a 33% absolute improvement by adding shallow programs. The model can learn to predict new operations when appropriate in a math word problem setting (Roy and Roth, 2015) with very few training examples.
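The core idea in the abstract — selecting one of a predefined set of executable "programs" over extracted numbers rather than generating digits directly — can be illustrated with a minimal sketch. The `PROGRAMS` table and `answer` helper below are hypothetical names for illustration, not the paper's actual implementation; in the real model, program and argument selection are predicted by the BERT-based reader.

```python
# Illustrative sketch of "pick a program and execute it" (not the paper's code).
# The reader predicts a program name plus arguments (numbers or spans found
# in the passage); the final answer comes from executing the program.

PROGRAMS = {
    "span": lambda x: x,        # plain extractive answer
    "add": lambda a, b: a + b,  # lightweight arithmetic
    "sub": lambda a, b: a - b,
}

def answer(program_name, *args):
    """Execute the predicted program on the predicted arguments."""
    return PROGRAMS[program_name](*args)

# e.g. "How many more X than Y?" with 48 and 26 extracted from the passage:
print(answer("sub", 48, 26))  # 22
```

Because the arithmetic is delegated to an executed program, the model only has to learn which operation fits the question, which is what makes adding new operations with few examples plausible.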
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| Question Answering on DROP (test) | BERT+Calculator (ensemble) | F1: 81.78 |