Transformer-based Named Entity Recognition with Combined Data Representation

Michał Marcińczuk

Abstract

This study examines transformer-based models and their effectiveness in named entity recognition tasks. It investigates three data representation strategies: single, merged, and context, which respectively place one sentence, multiple sentences, or a sentence joined with its surrounding context in each input vector. Analysis shows that a model trained with only one strategy may perform poorly when evaluated on the other data representations. To address this limitation, the study proposes a combined training procedure that utilizes all three strategies to improve model stability and adaptability. The results of this approach are presented and discussed for four languages (English, Polish, Czech, and German) across various datasets, demonstrating the effectiveness of the combined strategy.
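To make the three strategies concrete, the following is a minimal sketch, not the authors' released code, of how each representation could be built on top of a Hugging Face tokenizer. The sample document, the 128-token limit used for the merged strategy, and the one-sentence window used for the context strategy are illustrative assumptions.

```python
# Sketch of the three data representation strategies described in the abstract.
# Assumptions: a HuggingFace tokenizer and a document given as a list of sentences.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")

document = [
    "John Smith works at Acme Corp.",
    "He moved to Berlin in 2019.",
    "The company opened a new office there.",
]

def single(sentences):
    # "single": each input vector holds exactly one sentence.
    return [tokenizer(s)["input_ids"] for s in sentences]

def merged(sentences, max_length=128):
    # "merged": consecutive sentences are packed into one input vector
    # until the (assumed) length limit would be exceeded.
    vectors, current = [], []
    for s in sentences:
        candidate = " ".join(current + [s])
        if current and len(tokenizer(candidate)["input_ids"]) > max_length:
            vectors.append(tokenizer(" ".join(current))["input_ids"])
            current = [s]
        else:
            current.append(s)
    if current:
        vectors.append(tokenizer(" ".join(current))["input_ids"])
    return vectors

def context(sentences):
    # "context": each input vector contains the target sentence joined with its
    # neighbouring sentences; only the target sentence would be labelled.
    vectors = []
    for i, s in enumerate(sentences):
        left = sentences[i - 1] if i > 0 else ""
        right = sentences[i + 1] if i < len(sentences) - 1 else ""
        vectors.append(tokenizer(" ".join(filter(None, [left, s, right])))["input_ids"])
    return vectors
```

In the combined training procedure, examples drawn from all three representations would be mixed into one training set, so the resulting model is less sensitive to which representation it sees at inference time.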

Benchmarks

Benchmark                                   | Methodology             | Metrics
named-entity-recognition-ner-on-conll-2003  | XLM-RoBERTa-large union | F1: 93.69
