HyperAI


Coreference Resolution without Span Representations

Yuval Kirstain, Ori Ram, Omer Levy

Abstract

The introduction of pretrained language models has reduced many complex task-specific NLP models to simple lightweight layers. An exception to this trend is coreference resolution, where a sophisticated task-specific model is appended to a pretrained transformer encoder. While highly effective, the model has a very large memory footprint -- primarily due to dynamically-constructed span and span-pair representations -- which hinders the processing of complete documents and the ability to train on multiple instances in a single batch. We introduce a lightweight end-to-end coreference model that removes the dependency on span representations, handcrafted features, and heuristics. Our model performs competitively with the current standard model, while being simpler and more efficient.
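The core idea the abstract describes, scoring candidate spans directly from the encoder states of their start and end tokens rather than materializing a vector per span, can be sketched as follows. This is a hypothetical minimal illustration, not the authors' code; the projection matrices, dimensions, and bilinear scorer are stand-ins for the model's learned components.

```python
import numpy as np

# Minimal sketch of "start-to-end" span scoring: score a span (i, j)
# from the contextualized representations of token i (start) and
# token j (end), without ever constructing per-span vectors.

rng = np.random.default_rng(0)
seq_len, hidden = 6, 8

# Stand-in for contextualized token states from a pretrained encoder.
tokens = rng.normal(size=(seq_len, hidden))

# Lightweight projections casting each token into "start" and "end" roles.
W_start = rng.normal(size=(hidden, hidden))
W_end = rng.normal(size=(hidden, hidden))
B = rng.normal(size=(hidden, hidden))  # bilinear scoring matrix

starts = tokens @ W_start  # (seq_len, hidden)
ends = tokens @ W_end      # (seq_len, hidden)

# Score every (start, end) pair at once: s[i, j] = starts[i] @ B @ ends[j].
# Memory holds O(n^2) scalar scores, not O(n^2) span *vectors* as in
# span-based models -- this is the footprint reduction the abstract cites.
mention_scores = starts @ B @ ends.T  # (seq_len, seq_len)

# Only spans with start <= end are valid; mask the rest.
valid = np.triu(np.ones((seq_len, seq_len), dtype=bool))
mention_scores = np.where(valid, mention_scores, -np.inf)

print(mention_scores.shape)  # (6, 6)
```

Because the scores factor through per-token start/end projections, the quadratic pairwise computation reduces to matrix products over token states, which is what lets the model fit whole documents in memory.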

Code Repositories

yuvalkirstain/s2e-coref (official implementation, PyTorch)

Benchmarks

Benchmark: coreference resolution on CoNLL-2012
- s2e + Longformer-Large: Avg F1 80.3
- c2f + SpanBERT-Large: Avg F1 80.2

