
Zero-Resource Cross-Lingual Named Entity Recognition

M Saiful Bari; Shafiq Joty; Prathyusha Jwalapuram

Abstract

Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features. However, these models still require manually annotated training data, which is not available for many languages. In this paper, we propose an unsupervised cross-lingual NER model that can transfer NER knowledge from one language to another in a completely unsupervised way without relying on any bilingual dictionary or parallel data. Our model achieves this through word-level adversarial learning and augmented fine-tuning with parameter sharing and feature augmentation. Experiments on five different languages demonstrate the effectiveness of our approach, outperforming existing models by a good margin and setting a new SOTA for each language pair.
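The abstract describes transferring NER knowledge across languages via word-level adversarial learning, without a bilingual dictionary or parallel data. The sketch below illustrates the general idea of adversarially aligning monolingual word embeddings: a linear mapping is trained to fool a discriminator that tries to separate mapped source vectors from target vectors. This is a minimal illustration with toy data and assumed dimensions, not the authors' implementation; see the official repository below for that.

```python
# Minimal PyTorch sketch of word-level adversarial embedding alignment.
# Toy random embeddings and hyperparameters are illustrative only.
import torch
import torch.nn as nn

emb_dim = 300                 # assumed embedding size
n_src, n_tgt = 5000, 5000     # toy vocabulary sizes

# Stand-ins for pretrained monolingual word embeddings (kept fixed).
src_emb = torch.randn(n_src, emb_dim)
tgt_emb = torch.randn(n_tgt, emb_dim)

# Linear mapping from the source embedding space into the target space.
mapping = nn.Linear(emb_dim, emb_dim, bias=False)

# Discriminator tries to tell mapped source vectors from target vectors.
discriminator = nn.Sequential(
    nn.Linear(emb_dim, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1),
)

bce = nn.BCEWithLogitsLoss()
d_opt = torch.optim.SGD(discriminator.parameters(), lr=0.1)
m_opt = torch.optim.SGD(mapping.parameters(), lr=0.1)

for step in range(1000):
    src_batch = src_emb[torch.randint(n_src, (32,))]
    tgt_batch = tgt_emb[torch.randint(n_tgt, (32,))]

    # 1) Train the discriminator: mapped source = 0, target = 1.
    with torch.no_grad():
        mapped = mapping(src_batch)
    d_in = torch.cat([mapped, tgt_batch])
    d_labels = torch.cat([torch.zeros(32, 1), torch.ones(32, 1)])
    d_loss = bce(discriminator(d_in), d_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the mapping to fool the discriminator (flipped labels).
    g_loss = bce(discriminator(mapping(src_batch)), torch.ones(32, 1))
    m_opt.zero_grad(); g_loss.backward(); m_opt.step()
```

Once the spaces are aligned, an NER model trained on annotated English data can be applied to mapped embeddings of the unlabeled target language; the paper additionally uses augmented fine-tuning with parameter sharing and feature augmentation on top of this cross-lingual signal.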

Code Repositories

ntunlp/Zero-Shot-Cross-Lingual-NER (official, GitHub)

Benchmarks

Benchmark | Methodology | Metrics
low-resource-named-entity-recognition-on-4 | Zero-Resource Transfer From CoNLL-2003 English dataset | F1 score: 65.24
low-resource-named-entity-recognition-on-5 | Zero-Resource Cross-lingual Transfer From CoNLL-2003 English dataset | F1 score: 75.93
low-resource-named-entity-recognition-on-6 | Zero-Resource Transfer From CoNLL-2003 English dataset | F1 score: 74.61
