
Attending to Characters in Neural Sequence Labeling Models

Marek Rei; Gamal K.O. Crichton; Sampo Pyysalo

Abstract

Sequence labeling architectures use word embeddings for capturing similarity, but suffer when handling previously unseen or rare words. We investigate character-level extensions to such models and propose a novel architecture for combining alternative word representations. By using an attention mechanism, the model is able to dynamically decide how much information to use from a word- or character-level component. We evaluated different architectures on a range of sequence labeling datasets, and character-level extensions were found to improve performance on every benchmark. In addition, the proposed attention-based architecture delivered the best results even with a smaller number of trainable parameters.
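The abstract's "dynamically decide how much information to use" suggests a learned gate that interpolates, per word, between a word-level embedding and a character-derived representation. Below is a minimal PyTorch sketch of that idea; the module name `CharWordAttention`, the layer sizes, the gate formulation, and the bi-LSTM character encoder are illustrative assumptions rather than the paper's exact configuration.

```python
# Sketch: an attention gate combining word- and character-level views
# of a word. Names, sizes, and the char encoder are assumptions.
import torch
import torch.nn as nn


class CharWordAttention(nn.Module):
    def __init__(self, word_vocab, char_vocab, dim=100, char_dim=50):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, dim)
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        # Bi-LSTM over characters; its final states are projected to the
        # word-embedding dimension so the two views are comparable.
        self.char_lstm = nn.LSTM(char_dim, char_dim,
                                 bidirectional=True, batch_first=True)
        self.char_proj = nn.Linear(2 * char_dim, dim)
        # One plausible gate form: z = sigmoid(W3 tanh(W1 x + W2 m))
        self.w1 = nn.Linear(dim, dim, bias=False)
        self.w2 = nn.Linear(dim, dim, bias=False)
        self.w3 = nn.Linear(dim, dim, bias=False)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch,)   char_ids: (batch, max_word_len)
        x = self.word_emb(word_ids)                       # word-level view
        _, (h, _) = self.char_lstm(self.char_emb(char_ids))
        # Concatenate the forward and backward final states.
        m = torch.tanh(self.char_proj(torch.cat([h[0], h[1]], dim=-1)))
        z = torch.sigmoid(self.w3(torch.tanh(self.w1(x) + self.w2(m))))
        # Dynamic per-dimension interpolation between the two views.
        return z * x + (1 - z) * m


# Usage: combine representations for a batch of 4 words, 8 chars each.
model = CharWordAttention(word_vocab=1000, char_vocab=60)
words = torch.randint(0, 1000, (4,))
chars = torch.randint(0, 60, (4, 8))
print(model(words, chars).shape)  # torch.Size([4, 100])
```

Because the gate z is computed per dimension, such a model can lean on the character-level view for rare or unseen words, whose word embeddings are poorly trained, while favoring the word-level view elsewhere.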

Benchmarks

Benchmark                                | Methodology        | Metric
Grammatical Error Detection on FCE       | Bi-LSTM + charattn | F0.5: 41.88
Part-of-Speech Tagging on Penn Treebank  | Bi-LSTM + charattn | Accuracy: 97.27
