
Sense Embedding Learning for Word Sense Induction

Linfeng Song, Zhiguo Wang, Haitao Mi, Daniel Gildea

Abstract

Conventional word sense induction (WSI) methods usually represent each instance with discrete linguistic features or co-occurrence features, and train a model for each polysemous word individually. In this work, we propose to learn sense embeddings for the WSI task. In the training stage, our method induces several sense centroids (embeddings) for each polysemous word. In the testing stage, our method represents each instance as a contextual vector and induces its sense by finding the nearest sense centroid in the embedding space. The advantages of our method are that (1) distributed sense vectors are taken as the knowledge representations; they are trained discriminatively and usually perform better than traditional count-based distributional models, and (2) a general model for the whole vocabulary is jointly trained to induce sense centroids under a multitask learning framework. Evaluated on the SemEval-2010 WSI dataset, our method outperforms all participants and most of the recent state-of-the-art methods. We further verify the two advantages by comparing against carefully designed baselines.
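To make the two stages concrete, here is a minimal sketch of the induce-by-nearest-centroid idea. It is not the paper's actual model: it assumes context vectors are plain averages of pre-trained word embeddings and induces centroids by online nearest-centroid clustering with a fixed number of senses K, whereas the paper jointly trains a general model over the whole vocabulary under a multitask learning framework. The names `SenseInducer`, `context_vector`, and `embeddings` are illustrative, not from the paper.

```python
import numpy as np

def context_vector(context_words, embeddings, dim=100):
    """Hypothetical helper: average the embeddings of the context words."""
    vecs = [embeddings[w] for w in context_words if w in embeddings]
    if not vecs:
        return np.zeros(dim)
    return np.mean(vecs, axis=0)

class SenseInducer:
    """Sketch of per-word sense centroids with online nearest-centroid updates."""

    def __init__(self, num_senses=3, dim=100):
        self.num_senses = num_senses
        self.dim = dim
        self.centroids = {}  # word -> (K, dim) array of sense centroids
        self.counts = {}     # word -> per-sense update counts

    def train_instance(self, word, ctx_vec):
        """Training stage: pull the nearest sense centroid toward ctx_vec."""
        if word not in self.centroids:
            self.centroids[word] = np.random.randn(self.num_senses, self.dim) * 0.01
            self.counts[word] = np.zeros(self.num_senses)
        sims = self.centroids[word] @ ctx_vec        # similarity to each centroid
        k = int(np.argmax(sims))                      # nearest sense
        self.counts[word][k] += 1
        lr = 1.0 / self.counts[word][k]               # decaying per-sense learning rate
        self.centroids[word][k] += lr * (ctx_vec - self.centroids[word][k])

    def induce(self, word, ctx_vec):
        """Testing stage: return the index of the nearest sense centroid."""
        if word not in self.centroids:
            return 0  # unseen word: fall back to a default sense
        sims = self.centroids[word] @ ctx_vec
        return int(np.argmax(sims))
```

With the decaying learning rate, each centroid is the running mean of the context vectors assigned to it, so the training loop behaves like an online spherical k-means over contexts; the paper's discriminatively trained sense vectors replace this count-free averaging with a learned objective.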

Benchmarks

Benchmark                                     Methodology   AVG     F-Score   V-Measure
word-sense-induction-on-semeval-2010-wsi-1    SE-WSI-fix    23.24   55.1      9.8
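The AVG column is not defined on the page; assuming it follows common leaderboard practice for this benchmark, it matches the geometric mean of the two reported metrics:

$\text{AVG} = \sqrt{\text{F-Score} \times \text{V-Measure}} = \sqrt{55.1 \times 9.8} \approx 23.24$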
