Why Attention? Analyze BiLSTM Deficiency and Its Remedies in the Case of NER

Peng-Hsuan Li; Tsu-Jui Fu; Wei-Yun Ma

Abstract

BiLSTM has been prevalently used as a core module for NER in a sequence-labeling setup. State-of-the-art approaches use BiLSTM together with additional resources such as gazetteers, language modeling, or multi-task supervision to further improve NER. This paper instead takes a step back and focuses on analyzing the problems of BiLSTM itself and how exactly self-attention brings improvements. We formally show the limitation of (CRF-)BiLSTM in modeling cross-context patterns for each word -- the XOR limitation. We then show that two types of simple cross-structures -- self-attention and Cross-BiLSTM -- effectively remedy the problem. We test the practical impact of the deficiency on real-world NER datasets, OntoNotes 5.0 and WNUT 2017, and observe clear and consistent improvements over the baseline, up to 8.7% on some of the multi-token entity mentions. We give in-depth analyses of the improvements across several aspects of NER, especially the identification of multi-token mentions. This study should lay a sound foundation for future improvements on sequence-labeling NER. (Source code: https://github.com/jacobvsdanniel/cross-ner)
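To make the abstract's argument concrete, below is a minimal sketch in PyTorch (the authors' released code is TensorFlow; the class names, layer sizes, head count, and residual connection here are illustrative assumptions, not the paper's exact configuration). The baseline tagger scores each token through a linear projection of the concatenated forward and backward states, i.e. an additive function W_f h_fwd + W_b h_bwd of the two context features, which is the form the XOR limitation applies to. The two variants add a cross-structure: self-attention over the BiLSTM outputs, and a Cross-BiLSTM whose second layer feeds the concatenated first-layer outputs into both directions, letting left- and right-context features interact non-linearly before tagging.

```python
# Minimal sketch (PyTorch, not the authors' TensorFlow release) contrasting
# a plain BiLSTM tagger with the two cross-structure remedies.
# Dimensions and head counts are illustrative assumptions.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Baseline: per-token tag scores are a LINEAR (additive) function of
    the concatenated forward/backward states, W_f h_fwd + W_b h_bwd.
    This additive form cannot express XOR patterns spanning both contexts."""
    def __init__(self, dim=100, hidden=100, n_tags=18):
        super().__init__()
        self.bilstm = nn.LSTM(dim, hidden, bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, n_tags)

    def forward(self, x):                  # x: (batch, seq, dim)
        h, _ = self.bilstm(x)              # (batch, seq, 2*hidden)
        return self.out(h)                 # additive over the two directions

class AttBiLSTMTagger(BiLSTMTagger):
    """Remedy 1: self-attention on top of the BiLSTM mixes full-context
    information non-linearly into each token's representation."""
    def __init__(self, dim=100, hidden=100, n_tags=18, heads=4):
        super().__init__(dim, hidden, n_tags)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)

    def forward(self, x):
        h, _ = self.bilstm(x)
        a, _ = self.attn(h, h, h)          # token-to-token self-attention
        return self.out(h + a)             # residual mix before tagging

class CrossBiLSTMTagger(nn.Module):
    """Remedy 2: Cross-BiLSTM. Both directions of the second layer read the
    concatenated output of the first layer, so forward- and backward-context
    features interact through the LSTM non-linearity before tagging."""
    def __init__(self, dim=100, hidden=100, n_tags=18):
        super().__init__()
        self.layer1 = nn.LSTM(dim, hidden, bidirectional=True, batch_first=True)
        self.layer2 = nn.LSTM(2 * hidden, hidden, bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, n_tags)

    def forward(self, x):
        h1, _ = self.layer1(x)             # both contexts, per token
        h2, _ = self.layer2(h1)            # cross-interaction of contexts
        return self.out(h2)

x = torch.randn(2, 7, 100)                 # toy batch: 2 sentences, 7 tokens
for model in (BiLSTMTagger(), AttBiLSTMTagger(), CrossBiLSTMTagger()):
    print(type(model).__name__, model(x).shape)  # all: (2, 7, 18)
```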

Code Repositories

jacobvsdanniel/cross-ner (official, TensorFlow)
jacobvsdanniel/cross_ner (official, TensorFlow)
ckiplab/ckiptagger (TensorFlow)

Benchmarks

Benchmark                                      Methodology       F1     Precision  Recall
named-entity-recognition-ner-on-ontonotes-v5   Att-BiLSTM-CNN    88.4   88.71      88.11
named-entity-recognition-on-wnut-2017          Cross-BiLSTM-CNN  42.85  58.28      33.92
