Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations
Christian Hadiwinoto, Hwee Tou Ng, Wee Chung Gan

Abstract
Contextualized word representations are able to give different representations for the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks, such as question answering, named entity recognition, and sentiment analysis. However, evaluation on word sense disambiguation (WSD) in prior work shows that using contextualized word representations does not outperform the state-of-the-art approach that makes use of non-contextualized word embeddings. In this paper, we explore different strategies for integrating pre-trained contextualized word representations, and our best strategy achieves accuracies exceeding the best prior published accuracies by significant margins on multiple benchmark WSD datasets. We make the source code available at https://github.com/nusnlp/contextemb-wsd.
Benchmarks
Supervised WSD accuracy (%) on standard evaluation datasets:

| Method | Senseval 2 | Senseval 3 | SemEval 2007 | SemEval 2013 | SemEval 2015 |
|---|---|---|---|---|---|
| BERT (nearest neighbour) | 73.8 | 71.6 | 63.3 | 69.2 | 74.4 |
| BERT (linear projection) | 75.5 | 73.6 | 68.1 | 71.1 | 76.2 |
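To make the nearest-neighbour methodology from the table concrete, below is a minimal sketch using the Hugging Face `transformers` library: each sense is represented by the mean contextualized embedding of its annotated training occurrences, and a test occurrence is assigned the sense of the nearest centroid by cosine similarity. The model name, the toy sense inventory, and the helper function are illustrative assumptions, not the authors' exact implementation (see their repository for that).

```python
# Sketch of WSD by nearest neighbour over BERT embeddings.
# Assumptions: bert-base-uncased as the encoder, a toy two-sense
# inventory for "bank", and hypothetical sense keys/sentences.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def target_embedding(sentence: str, target: str) -> torch.Tensor:
    """Contextualized embedding of the target word, averaged
    over its WordPiece sub-tokens."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)
    target_ids = tokenizer(target, add_special_tokens=False)["input_ids"]
    tokens = enc["input_ids"][0].tolist()
    # Locate the first occurrence of the target's sub-token span.
    for i in range(len(tokens) - len(target_ids) + 1):
        if tokens[i:i + len(target_ids)] == target_ids:
            return hidden[i:i + len(target_ids)].mean(dim=0)
    raise ValueError(f"target {target!r} not found in sentence")

# Toy sense-annotated training data (hypothetical examples).
train = {
    "bank%finance": ["She deposited the cheque at the bank."],
    "bank%river": ["They had a picnic on the bank of the river."],
}

# Sense centroid = mean embedding of that sense's training occurrences.
centroids = {
    sense: torch.stack([target_embedding(s, "bank") for s in sents]).mean(dim=0)
    for sense, sents in train.items()
}

# Predict the sense whose centroid is nearest to the test occurrence.
test_vec = target_embedding("He rowed the boat toward the bank.", "bank")
pred = max(centroids, key=lambda s: torch.cosine_similarity(
    test_vec, centroids[s], dim=0).item())
print(pred)  # expected: bank%river
```

The linear projection variant in the table instead trains a linear layer on top of the same contextualized vectors to score the candidate senses of each target word (e.g. `torch.nn.Linear(768, num_senses)` optimized with cross-entropy loss), rather than comparing against stored centroids.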