Syntactic Multi-view Learning for Open Information Extraction

Kuicai Dong, Aixin Sun, Jung-Jae Kim, Xiaoli Li


Abstract

Open Information Extraction (OpenIE) aims to extract relational tuples from open-domain sentences. Traditional rule-based or statistical models have been developed based on the syntactic structures of sentences, identified by syntactic parsers. However, previous neural OpenIE models under-explore this useful syntactic information. In this paper, we model both constituency and dependency trees as word-level graphs, enabling neural OpenIE to learn from syntactic structures. To better fuse the heterogeneous information from the two graphs, we adopt multi-view learning to capture the multiple relationships between them. Finally, the fine-tuned constituency and dependency representations are aggregated with sentential semantic representations for tuple generation. Experiments show that both constituency and dependency information, as well as the multi-view learning mechanism, are effective.
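
The approach summarized in the abstract can be sketched roughly as follows. This is not the authors' implementation: the toy sentence, the hand-written dependency and constituency edges, the hidden size, and the single-layer GCN "views" are illustrative assumptions; real use would take contextual embeddings from BERT and word-level graphs derived from a syntactic parser.

```python
# Minimal sketch of the idea: encode words with semantic embeddings, build
# word-level graphs from dependency and constituency structure, run one GCN
# view over each graph, and concatenate the views with the semantic features
# before tuple tagging. All concrete values below are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

words = ["Obama", "was", "born", "in", "Hawaii"]
n, d = len(words), 16

# Stand-in for BERT token embeddings (the sentential semantic representation).
semantic = torch.randn(n, d)

# Word-level dependency graph: undirected edges between each word and its head
# (hand-written parse for the toy sentence, indices into `words`).
dep_edges = [(0, 2), (1, 2), (3, 2), (4, 3)]

# Word-level constituency graph: here approximated by linking words that share
# a low-level phrase (e.g. the PP "in Hawaii"); a real conversion would follow
# the constituency tree produced by a parser.
const_edges = [(0, 1), (1, 2), (3, 4), (2, 3)]

def adjacency(edges: list[tuple[int, int]], n: int) -> torch.Tensor:
    """Symmetric, self-looped, row-normalised adjacency matrix."""
    a = torch.eye(n)
    for i, j in edges:
        a[i, j] = a[j, i] = 1.0
    return a / a.sum(dim=1, keepdim=True)

class GCNView(nn.Module):
    """One graph-convolution 'view' over a word-level syntactic graph."""
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.linear(adj @ x))

dep_view = GCNView(d)
const_view = GCNView(d)

h_dep = dep_view(semantic, adjacency(dep_edges, n))
h_const = const_view(semantic, adjacency(const_edges, n))

# Aggregate the syntactic views with the semantic representation; a sequence
# tagger over `fused` would then label the spans used to generate tuples.
fused = torch.cat([semantic, h_dep, h_const], dim=-1)   # shape (n, 3 * d)
print(fused.shape)  # torch.Size([5, 48])
```

The multi-view training objective that encourages the dependency and constituency views to capture complementary relationships is omitted here; the sketch only shows how the two syntactic graphs and the semantic features can be combined at the word level.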

Code Repositories

daviddongkc/smile_oie (official implementation, on GitHub)

Benchmarks

Benchmark: open-information-extraction-on-lsoie-wiki

Methodology                                     F1
BERT + Dep-GCN - Const-GCN                      50.21
IMoJIE (Kolluru et al., 2020)                   49.24
GloVe + bi-LSTM + CRF                           44.48
BERT (Solawetz and Larson, 2021)                47.54
BERT + Dep-GCN [?] Const-GCN                    49.89
CopyAttention (Cui et al., 2018)                39.52
CIGL-OIE + IGL-CA (Kolluru et al., 2020)        44.75
BERT + Dep-GCN                                  48.71
BERT + Const-GCN                                49.71
SMiLe-OIE                                       51.73
GloVe + bi-LSTM (Stanovsky et al., 2018)        43.9
