He-Yan Huang, Yang Gao, Qian Liu, Yuxin Tian, Xiaochi Wei, Luyang Liu

Abstract
Distributed word representation plays a pivotal role in various natural language processing tasks. Despite this success, most existing methods consider only contextual information, which is suboptimal across tasks because the resulting embeddings lack task-specific features. Rational word embeddings should capture both the semantic features and the task-specific features of words. In this paper, we propose a task-oriented word embedding method and apply it to the text classification task. With the function-aware component, our method regularizes the distribution of words so that the embedding space has a clear classification boundary. We evaluate our method on five text classification datasets. The experimental results show that our method significantly outperforms the state-of-the-art methods.
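The abstract describes a skip-gram-style objective augmented with a "function-aware" regularizer that shapes the embedding space for classification. Below is a minimal sketch of that idea, assuming the regularizer can be approximated by a margin term that pulls words salient to the same class together and pushes words salient to different classes apart; the class names, margin, and weighting here are hypothetical illustrations, not the paper's exact formulation.

```python
# Sketch of a task-oriented skip-gram objective (ToWE-SG style).
# Assumptions (not taken from the abstract): the function-aware component is
# approximated by a hinge-style margin regularizer over pairs of class-salient
# words; `same_class` / `diff_class`, `reg_weight`, and `margin` are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TaskOrientedSkipGram(nn.Module):
    def __init__(self, vocab_size, dim=100, reg_weight=0.1, margin=1.0):
        super().__init__()
        self.in_emb = nn.Embedding(vocab_size, dim)   # target-word vectors
        self.out_emb = nn.Embedding(vocab_size, dim)  # context-word vectors
        self.reg_weight = reg_weight
        self.margin = margin

    def skipgram_loss(self, center, context, negatives):
        """Standard skip-gram with negative sampling."""
        v = self.in_emb(center)                       # (B, dim)
        u_pos = self.out_emb(context)                 # (B, dim)
        u_neg = self.out_emb(negatives)               # (B, K, dim)
        pos = F.logsigmoid((v * u_pos).sum(-1))                               # (B,)
        neg = F.logsigmoid(-(u_neg @ v.unsqueeze(-1)).squeeze(-1)).sum(-1)    # (B,)
        return -(pos + neg).mean()

    def function_aware_reg(self, same_class, diff_class):
        """Hypothetical regularizer: word pairs salient to the same class should
        be closer than pairs salient to different classes, by a margin."""
        d_same = (self.in_emb(same_class[:, 0]) - self.in_emb(same_class[:, 1])).norm(dim=-1)
        d_diff = (self.in_emb(diff_class[:, 0]) - self.in_emb(diff_class[:, 1])).norm(dim=-1)
        return F.relu(self.margin + d_same - d_diff).mean()

    def forward(self, center, context, negatives, same_class, diff_class):
        return (self.skipgram_loss(center, context, negatives)
                + self.reg_weight * self.function_aware_reg(same_class, diff_class))


# Toy usage with random indices, just to show the expected tensor shapes.
model = TaskOrientedSkipGram(vocab_size=1000)
center = torch.randint(0, 1000, (32,))
context = torch.randint(0, 1000, (32,))
negatives = torch.randint(0, 1000, (32, 5))
same_class = torch.randint(0, 1000, (16, 2))
diff_class = torch.randint(0, 1000, (16, 2))
loss = model(center, context, negatives, same_class, diff_class)
```

In this sketch the task signal enters only through which word pairs are fed to the regularizer, so the contextual skip-gram term and the task-specific term can be traded off with a single weight.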
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| sentiment-analysis-on-imdb | ToWE-SG | Accuracy: 90.8 |
| sentiment-analysis-on-sst-2-binary | ToWE-CBOW | Accuracy: 78.8 |
| text-classification-on-ag-news | ToWE-SG | Error: 14.0 |