mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models

Ryokan Ri; Ikuya Yamada; Yoshimasa Tsuruoka

Abstract

Recent studies have shown that multilingual pretrained language models can be effectively improved with cross-lingual alignment information from Wikipedia entities. However, existing methods only exploit entity information in pretraining and do not explicitly use entities in downstream tasks. In this study, we explore the effectiveness of leveraging entity representations for downstream cross-lingual tasks. We train a multilingual language model on 24 languages with entity representations and show that the model consistently outperforms word-based pretrained models in various cross-lingual transfer tasks. We also analyze the model, and the key insight is that incorporating entity representations into the input allows us to extract more language-agnostic features. We further evaluate the model on a multilingual cloze prompt task with the mLAMA dataset, and show that entity-based prompts elicit correct factual knowledge more reliably than using only word representations. Our source code and pretrained models are available at https://github.com/studio-ousia/luke.
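
The idea of incorporating entity representations into the input can be tried directly: the authors' checkpoints are also distributed through the Hugging Face transformers library. Below is a minimal sketch, assuming the studio-ousia/mluke-base checkpoint and the library's MLukeTokenizer/LukeModel classes; it passes character-level entity spans alongside the words so the model returns contextualized representations for both.

```python
# Minimal sketch: feeding entity spans to mLUKE via Hugging Face transformers.
# Assumes the `studio-ousia/mluke-base` checkpoint is available on the Hub.
from transformers import MLukeTokenizer, LukeModel

tokenizer = MLukeTokenizer.from_pretrained("studio-ousia/mluke-base")
model = LukeModel.from_pretrained("studio-ousia/mluke-base")

text = "Beyoncé lives in Los Angeles."
# Character spans of the entity mentions "Beyoncé" and "Los Angeles".
entity_spans = [(0, 7), (17, 28)]

inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
outputs = model(**inputs)

# Contextualized states for the word tokens and for each entity span.
word_states = outputs.last_hidden_state
entity_states = outputs.entity_last_hidden_state
print(word_states.shape, entity_states.shape)
```

Because the same entity vocabulary is shared across all 24 pretraining languages, the entity states extracted this way are the more language-agnostic features the abstract refers to.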

Benchmarks

| Benchmark | Methodology | Metrics |
| --- | --- | --- |
| cross-lingual-question-answering-on-xquad | mLUKE-E | Average F1: 74.2 |
