
Context-Aware Representations for Knowledge Base Relation Extraction

Daniil Sorokin, Iryna Gurevych

Abstract

We demonstrate that for sentence-level relation extraction it is beneficial to consider other relations in the sentential context while predicting the target relation. Our architecture uses an LSTM-based encoder to jointly learn representations for all relations in a single sentence. We combine the context representations with an attention mechanism to make the final prediction. We use the Wikidata knowledge base to construct a dataset of multiple relations per sentence and to evaluate our approach. Compared to a baseline system, our method achieves an average error reduction of 24% on a held-out set of relations. The code and the dataset to replicate the experiments are available at https://github.com/ukplab/.
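The core idea can be illustrated with a minimal sketch: a shared bidirectional LSTM encodes each candidate relation mention in the sentence (e.g., with entity-position markers folded into the input), and an attention layer pools the other relation representations into a context vector that is combined with the target representation before classification. This is a hypothetical PyTorch illustration under those assumptions, not the authors' released implementation; the class name, dimensions, and input encoding are placeholders.

```python
import torch
import torch.nn as nn

class ContextAttSketch(nn.Module):
    """Illustrative context-aware relation classifier (assumed structure):
    one LSTM pass per candidate relation, attention over all relation
    representations in the same sentence, joint target+context prediction."""

    def __init__(self, vocab_size, num_relations, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)            # scores each context relation
        self.classifier = nn.Linear(4 * hidden_dim, num_relations)

    def encode(self, token_ids):
        # token_ids: (batch, relations_per_sentence, seq_len)
        b, r, t = token_ids.shape
        emb = self.embed(token_ids.view(b * r, t))
        _, (h, _) = self.encoder(emb)
        # concatenate final forward and backward hidden states
        rel_repr = torch.cat([h[-2], h[-1]], dim=-1)         # (b*r, 2*hidden)
        return rel_repr.view(b, r, -1)

    def forward(self, token_ids, target_index):
        reprs = self.encode(token_ids)                       # (b, r, 2*hidden)
        target = reprs[torch.arange(reprs.size(0)), target_index]
        # attention-weighted sum over all relation representations in the sentence
        weights = torch.softmax(self.attn(reprs).squeeze(-1), dim=-1)
        context = (weights.unsqueeze(-1) * reprs).sum(dim=1)
        return self.classifier(torch.cat([target, context], dim=-1))
```

A usage call would pass a tensor of token ids for every relation candidate in the sentence plus the index of the target relation, e.g. `model(token_ids, target_index)`, yielding logits over the relation inventory.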

Benchmarks

Benchmark: relation-extraction-on-wikipedia-wikidata
Methodology: ContextAtt
Metrics: Error rate 0.1590
