Contrastive Triple Extraction with Generative Transformer

Hongbin Ye, Ningyu Zhang, Shumin Deng, Mosha Chen, Chuanqi Tan, Fei Huang, Huajun Chen


Abstract

Triple extraction is an essential task in information extraction for natural language processing and knowledge graph construction. In this paper, we revisit end-to-end triple extraction as a sequence generation task. Since generative triple extraction may struggle to capture long-term dependencies and can generate unfaithful triples, we introduce a novel model: contrastive triple extraction with a generative transformer. Specifically, we introduce a single shared transformer module for encoder-decoder-based generation. To generate faithful results, we propose a novel triplet contrastive training objective. Moreover, we introduce two mechanisms to further improve model performance, namely batch-wise dynamic attention-masking and triple-wise calibration. Experimental results on three datasets (NYT, WebNLG, and MIE) show that our approach outperforms the baselines.
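To make the contrastive objective concrete, below is a minimal sketch of what a margin-based triplet contrastive loss over generated triple sequences could look like. The corruption strategy, the margin formulation, and the Hugging Face-style `model(input_ids=..., decoder_input_ids=...)` interface are illustrative assumptions, not the paper's exact implementation: the idea is simply to push the log-likelihood of the gold triple sequence above that of a corrupted (unfaithful) one.

```python
import torch
import torch.nn.functional as F

def sequence_log_prob(model, input_ids, target_ids):
    """Sum of token log-likelihoods of `target_ids` under the model,
    teacher-forced on the source `input_ids`.
    Assumes a Hugging Face-style encoder-decoder interface."""
    logits = model(input_ids=input_ids, decoder_input_ids=target_ids).logits
    log_probs = F.log_softmax(logits, dim=-1)
    # Shift so that position t scores token t+1.
    token_lp = log_probs[:, :-1].gather(
        -1, target_ids[:, 1:].unsqueeze(-1)
    ).squeeze(-1)
    return token_lp.sum(dim=-1)  # shape: (batch,)

def triplet_contrastive_loss(model, src_ids, gold_ids, corrupt_ids, margin=1.0):
    """Margin loss: the gold triple sequence should score at least
    `margin` higher than a corrupted negative (e.g., a triple with a
    swapped entity or relation)."""
    pos = sequence_log_prob(model, src_ids, gold_ids)
    neg = sequence_log_prob(model, src_ids, corrupt_ids)
    return F.relu(margin - (pos - neg)).mean()
```

Similarly, a single shared transformer stack can act as both encoder and decoder by using a partially causal attention mask, as in UniLM, on which the CGT(UniLM) variant builds. The sketch below shows one common way to construct such a mask; it illustrates the general mechanism rather than the paper's specific batch-wise dynamic masking.

```python
def seq2seq_attention_mask(src_len, tgt_len):
    """Boolean mask for a shared transformer doing encoder-decoder-style
    generation in one stack (True = attention allowed).
    Source positions attend bidirectionally within the source; target
    positions attend to the full source and causally to earlier targets."""
    n = src_len + tgt_len
    mask = torch.zeros(n, n, dtype=torch.bool)
    mask[:, :src_len] = True  # every position sees the whole source
    mask[src_len:, src_len:] = torch.tril(
        torch.ones(tgt_len, tgt_len, dtype=torch.bool)
    )  # target positions are causal; source never attends to targets
    return mask
```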

Benchmarks

Benchmark                       Methodology   Metrics
relation-extraction-on-webnlg   CGT(UniLM)    F1: 83.4
