BERT for Joint Intent Classification and Slot Filling
Qian Chen; Zhu Zhuo; Wen Wang

Abstract
Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability, especially for rare words. Recently, a new language representation model, BERT (Bidirectional Encoder Representations from Transformers), has enabled pre-training deep bidirectional representations on large-scale unlabeled corpora and, after simple fine-tuning, has produced state-of-the-art models for a wide variety of natural language processing tasks. However, there has not been much effort in exploring BERT for natural language understanding. In this work, we propose a joint intent classification and slot filling model based on BERT. Experimental results demonstrate that our proposed model achieves significant improvement on intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to attention-based recurrent neural network models and slot-gated models.
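The abstract describes the architecture only at a high level: intent classification and slot filling share a single BERT encoder and are trained jointly. The following is a minimal sketch of such a joint model, assuming PyTorch and the Hugging Face Transformers `BertModel`; the class name `JointBertNLU`, the 0.1 dropout, the `bert-base-uncased` checkpoint, and the unweighted sum of the two cross-entropy losses are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a joint intent classification + slot filling model on BERT.
# Assumptions: PyTorch, Hugging Face Transformers; names and hyperparameters
# here are illustrative, not the authors' exact configuration.
import torch.nn as nn
from transformers import BertModel


class JointBertNLU(nn.Module):
    """Joint intent classification and slot filling on top of a shared BERT encoder."""

    def __init__(self, num_intents: int, num_slot_labels: int,
                 pretrained_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrained_name)
        hidden = self.bert.config.hidden_size
        self.dropout = nn.Dropout(0.1)
        # Intent head reads the pooled [CLS] representation of the sentence.
        self.intent_classifier = nn.Linear(hidden, num_intents)
        # Slot head reads every per-token hidden state.
        self.slot_classifier = nn.Linear(hidden, num_slot_labels)

    def forward(self, input_ids, attention_mask, token_type_ids=None):
        outputs = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask,
                            token_type_ids=token_type_ids)
        sequence_output = self.dropout(outputs.last_hidden_state)
        pooled_output = self.dropout(outputs.pooler_output)
        intent_logits = self.intent_classifier(pooled_output)
        slot_logits = self.slot_classifier(sequence_output)
        return intent_logits, slot_logits


def joint_loss(intent_logits, slot_logits, intent_labels, slot_labels,
               ignore_index: int = -100):
    # Summing the two cross-entropy terms is what makes training joint;
    # a weighting coefficient between them could be added here.
    intent_loss = nn.functional.cross_entropy(intent_logits, intent_labels)
    slot_loss = nn.functional.cross_entropy(
        slot_logits.view(-1, slot_logits.size(-1)),
        slot_labels.view(-1),
        ignore_index=ignore_index,  # mask padding / sub-word positions
    )
    return intent_loss + slot_loss
```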
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| intent-detection-on-atis | Joint BERT + CRF | Accuracy: 97.9 |
| intent-detection-on-atis | Joint BERT | Accuracy: 97.5 |
| slot-filling-on-atis | Joint BERT + CRF | F1: 96.0 |
| slot-filling-on-atis | Joint BERT | F1: 96.1 |
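In the "Joint BERT + CRF" rows, the independent per-token softmax over slot labels is replaced by a CRF layer, so that transitions between slot labels are modeled jointly. A hedged sketch of such a head, assuming the third-party `pytorch-crf` package (`torchcrf.CRF`) and slot logits produced by a model like the `JointBertNLU` sketch above:

```python
# Hypothetical CRF variant: emissions are the slot logits from the shared BERT
# encoder; torchcrf is the third-party pytorch-crf package, used here as an
# illustration of the "+ CRF" rows rather than the authors' exact setup.
import torch.nn as nn
from torchcrf import CRF


class SlotCRFHead(nn.Module):
    def __init__(self, num_slot_labels: int):
        super().__init__()
        self.crf = CRF(num_slot_labels, batch_first=True)

    def loss(self, slot_logits, slot_labels, mask):
        # The CRF returns a log-likelihood; negate it to obtain a loss to minimize.
        return -self.crf(slot_logits, slot_labels, mask=mask, reduction="mean")

    def decode(self, slot_logits, mask):
        # Viterbi decoding of the most likely slot label sequence per sentence.
        return self.crf.decode(slot_logits, mask=mask)
```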