
GraPPa: Grammar-Augmented Pre-Training for Table Semantic Parsing

Tao Yu, Chien-Sheng Wu, Xi Victoria Lin, Bailin Wang, Yi Chern Tan, Xinyi Yang, Dragomir Radev, Richard Socher, Caiming Xiong

Abstract

We present GraPPa, an effective pre-training approach for table semantic parsing that learns a compositional inductive bias in the joint representations of textual and tabular data. We construct synthetic question-SQL pairs over high-quality tables via a synchronous context-free grammar (SCFG) induced from existing text-to-SQL datasets. We pre-train our model on the synthetic data using a novel text-schema linking objective that predicts the syntactic role of a table field in the SQL for each question-SQL pair. To maintain the model's ability to represent real-world data, we also include masked language modeling (MLM) over several existing table-and-language datasets to regularize the pre-training process. On four popular fully supervised and weakly supervised table semantic parsing benchmarks, GraPPa significantly outperforms RoBERTa-large when used as the feature representation layer and establishes new state-of-the-art results on all of them.
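As a rough illustration of how the two objectives described in the abstract might be combined, the sketch below wires a token-level MLM head together with a column-level classification head that predicts each column's syntactic role in the synthetic SQL. This is not the authors' released code: the class name GrappaPretrainHead, the equal loss weighting, and the use of each column's first token as its representation are illustrative assumptions; the encoder hidden states are assumed to come from a RoBERTa-style model run over a "question </s> column1 </s> column2 ..." input.

```python
# A minimal sketch (not the authors' code) of GraPPa-style pre-training losses.
import torch
import torch.nn as nn

class GrappaPretrainHead(nn.Module):
    """Combines masked language modeling (MLM) with a column-operation
    classification objective: for each column, predict whether it appears in
    the SQL and with which syntactic role (e.g. SELECT, WHERE =)."""

    def __init__(self, hidden_size: int, vocab_size: int, num_ops: int):
        super().__init__()
        self.mlm_head = nn.Linear(hidden_size, vocab_size)  # token-level MLM logits
        self.ssp_head = nn.Linear(hidden_size, num_ops)      # column-level operation logits
        self.ce = nn.CrossEntropyLoss(ignore_index=-100)     # -100 marks unlabeled positions

    def forward(self, hidden_states, mlm_labels, column_positions, column_labels):
        # hidden_states:    (batch, seq_len, hidden) from the encoder
        # mlm_labels:       (batch, seq_len), -100 where the token was not masked
        # column_positions: (batch, num_cols) index of each column's first token
        # column_labels:    (batch, num_cols) operation class per column, -100 to ignore
        mlm_logits = self.mlm_head(hidden_states)
        mlm_loss = self.ce(mlm_logits.view(-1, mlm_logits.size(-1)), mlm_labels.view(-1))

        # Gather the representation of each column's first token for role prediction.
        idx = column_positions.unsqueeze(-1).expand(-1, -1, hidden_states.size(-1))
        col_states = torch.gather(hidden_states, 1, idx)
        ssp_logits = self.ssp_head(col_states)
        ssp_loss = self.ce(ssp_logits.view(-1, ssp_logits.size(-1)), column_labels.view(-1))

        return mlm_loss + ssp_loss  # equal weighting is an assumption here

# Toy usage with random tensors standing in for real encoder outputs.
if __name__ == "__main__":
    B, T, H, V, C, OPS = 2, 16, 8, 100, 3, 10
    head = GrappaPretrainHead(H, V, OPS)
    hidden = torch.randn(B, T, H)
    mlm_labels = torch.full((B, T), -100, dtype=torch.long)
    mlm_labels[:, 3] = 7                                   # pretend one token was masked
    col_pos = torch.tensor([[5, 8, 11], [4, 9, 13]])       # first-token index of each column
    col_labels = torch.randint(0, OPS, (B, C))             # role label per column
    print(head(hidden, mlm_labels, col_pos, col_labels))
```

In the paper's setup, the column role labels come for free from the SCFG-generated SQL, while the MLM loss is computed over existing table-and-language corpora; the sketch only shows how such labels could be consumed once produced.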

Code Repositories

taoyds/grappa (PyTorch)
Mentioned in GitHub

Benchmarks

Benchmark                   | Methodology                              | Metrics
semantic-parsing-on-spider  | RATSQL + Grammar-Augmented Pre-Training  | Accuracy: 69.6
