Jointly Learning to Label Sentences and Tokens

Marek Rei; Anders Søgaard


Abstract

Learning to construct text representations in end-to-end systems can be difficult, as natural languages are highly compositional and task-specific annotated datasets are often limited in size. Methods for directly supervising language composition can allow us to guide the models based on existing knowledge, regularizing them towards more robust and interpretable representations. In this paper, we investigate how objectives at different granularities can be used to learn better language representations and we propose an architecture for jointly learning to label sentences and tokens. The predictions at each level are combined using an attention mechanism, with token-level labels also acting as explicit supervision for composing sentence-level representations. Our experiments show that by learning to perform these tasks jointly on multiple levels, the model achieves substantial improvements for both sentence classification and sequence labeling.
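To make the composition concrete, below is a minimal PyTorch sketch of the core idea: a BiLSTM produces per-token label scores, those scores double as attention weights for composing the sentence-level representation, and the token-level and sentence-level losses are optimized jointly. All names, dimensions, the binary label setting, and the 0.5 loss weighting are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal sketch of joint sentence/token labeling with prediction-based
# attention. Hyperparameters and the binary setup are illustrative only.
import torch
import torch.nn as nn

class JointLabeler(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Token-level head: one label score per token (binary case).
        self.token_head = nn.Linear(2 * hidden_dim, 1)
        # Sentence-level head applied to the attention-weighted composition.
        self.sent_head = nn.Linear(2 * hidden_dim, 1)

    def forward(self, token_ids):
        h, _ = self.bilstm(self.embed(token_ids))      # (B, T, 2H)
        token_logits = self.token_head(h).squeeze(-1)  # (B, T)
        # Token-level predictions act as attention weights, so supervising
        # them also supervises how the sentence representation is composed.
        attn = torch.softmax(token_logits, dim=1)                # (B, T)
        sent_repr = torch.bmm(attn.unsqueeze(1), h).squeeze(1)   # (B, 2H)
        sent_logits = self.sent_head(sent_repr).squeeze(-1)      # (B,)
        return token_logits, sent_logits

# Joint objective: weighted sum of sentence- and token-level losses.
model = JointLabeler(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 12))           # toy batch of 4 sentences
token_gold = torch.randint(0, 2, (4, 12)).float()  # per-token gold labels
sent_gold = token_gold.max(dim=1).values  # sentence positive iff any token is
bce = nn.BCEWithLogitsLoss()
token_logits, sent_logits = model(tokens)
loss = bce(sent_logits, sent_gold) + 0.5 * bce(token_logits, token_gold)
loss.backward()
```

Because the attention weights and the token predictions are the same quantity, the sentence classifier only improves when attention concentrates on the tokens that genuinely carry the sentence label, which is what gives the explicit token supervision its regularizing effect.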


Benchmarks

Benchmark                                     | Methodology                   | Metrics
grammatical-error-detection-on-conll-2014-a1  | BiLSTM-JOINT (trained on FCE) | F0.5: 22.14
grammatical-error-detection-on-conll-2014-a2  | BiLSTM-JOINT (trained on FCE) | F0.5: 29.65
grammatical-error-detection-on-fce            | BiLSTM-JOINT                  | F0.5: 52.07
grammatical-error-detection-on-jfleg          | BiLSTM-JOINT (trained on FCE) | F0.5: 52.52
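All scores use the F0.5 measure, the weighted F-score that values precision twice as heavily as recall, a standard choice for error detection, where false alarms are especially costly. For precision P and recall R:

F0.5 = (1 + 0.5^2) * P * R / (0.5^2 * P + R) = 1.25 * P * R / (0.25 * P + R)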
