Sparsity Makes Sense: Word Sense Disambiguation Using Sparse Contextualized Word Representations

Gábor Berend


Abstract

In this paper, we demonstrate that utilizing sparse word representations makes it possible to surpass the results of more complex task-specific models on fine-grained all-words word sense disambiguation. Our proposed algorithm relies on an overcomplete set of semantic basis vectors that allows us to obtain sparse contextualized word representations. We introduce an information theory-inspired synset representation based on the co-occurrence of word senses and non-zero coordinates of word forms, which allows us to achieve an aggregated F-score of 78.8 over a combination of five standard word sense disambiguation benchmark datasets. We also demonstrate the general applicability of our framework by evaluating it on part-of-speech tagging over four different treebanks. Our results indicate a significant improvement over the use of dense word representations.
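The core idea described above can be illustrated with a small sketch: tokens are sparse-coded against an overcomplete dictionary, sense–coordinate co-occurrences are turned into a (positive) PMI matrix, and a token's sense is predicted by summing PMI weights over its active coordinates. Everything below is a toy stand-in, not the authors' implementation: the dictionary is random rather than learned, the "sparse coding" step is a crude top-k projection, and the data, sizes, and helper names (`sparse_code`, `predict`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all sizes hypothetical): dense contextual embeddings of
# dimension d, an overcomplete basis of k > d semantic directions, and
# a tiny sense inventory.
d, k, n_senses, n_train = 16, 64, 3, 300
D = rng.normal(size=(k, d))              # random overcomplete "dictionary"
D /= np.linalg.norm(D, axis=1, keepdims=True)

def sparse_code(x, top=5):
    """Crude stand-in for sparse coding: keep only the `top` strongest
    non-negative projections onto the dictionary atoms."""
    a = np.maximum(D @ x, 0.0)
    mask = np.zeros_like(a)
    mask[np.argsort(a)[-top:]] = 1.0
    return a * mask

# Simulated training data: each sense has a prototype embedding; tokens
# are noisy copies of their sense's prototype.
prototypes = rng.normal(size=(n_senses, d))
labels = rng.integers(0, n_senses, size=n_train)
X = prototypes[labels] + 0.3 * rng.normal(size=(n_train, d))

# Count co-occurrences of senses and non-zero coordinates, then form a
# positive PMI matrix in the spirit of the paper's information
# theory-inspired synset representation.
C = np.zeros((n_senses, k))
for x, s in zip(X, labels):
    C[s] += (sparse_code(x) > 0)
joint = C / C.sum()
p_sense = joint.sum(axis=1, keepdims=True)
p_coord = joint.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log(joint / (p_sense * p_coord))
ppmi = np.where(np.isfinite(pmi), np.maximum(pmi, 0.0), 0.0)

def predict(x):
    """Score each sense by summing PPMI weights over the token's
    active coordinates; return the best-scoring sense."""
    active = sparse_code(x) > 0
    return int(np.argmax(ppmi @ active))

# Disambiguate a held-out token drawn from sense 1's neighborhood.
test_token = prototypes[1] + 0.3 * rng.normal(size=d)
print(predict(test_token))
```

On this clean toy data the PPMI scorer recovers the generating sense for most tokens; the real method instead sparse-codes actual contextualized embeddings (e.g. from a pretrained language model) over a learned dictionary and aggregates over annotated sense inventories.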

Benchmarks

Benchmark: Word Sense Disambiguation (supervised), F-score

Methodology     | SemEval 2007 | SemEval 2013 | SemEval 2015 | Senseval 2 | Senseval 3
SparseLMMS+WNGC | 73.0         | 79.4         | 81.3         | 79.6       | 77.3
SparseLMMS      | 68.8         | 76.1         | 77.5         | 77.9       | 77.8
