MIDAS: Multi-level Intent, Domain, And Slot Knowledge Distillation for Multi-turn NLU
Yan Li, So-Eon Kim, Seong-Bae Park, Soyeon Caren Han

Abstract
Although Large Language Models (LLMs) can generate coherent text, they often struggle to recognise user intent behind queries. In contrast, Natural Language Understanding (NLU) models interpret the purpose and key information of user input for responsive interactions. Existing NLU models typically map utterances to a dual-level semantic frame, involving sentence-level intent (SI) and word-level slot (WS) labels. However, real-life conversations primarily consist of multi-turn dialogues, requiring the interpretation of complex and extended exchanges. Researchers encounter challenges in addressing all facets of multi-turn dialogue using a unified NLU model. This paper introduces MIDAS, a novel approach leveraging multi-level intent, domain, and slot knowledge distillation for multi-turn NLU. We construct distinct teachers for SI detection, WS filling, and conversation-level domain (CD) classification, each fine-tuned for specific knowledge. A multi-teacher loss is proposed to facilitate the integration of these teachers, guiding a student model in multi-turn dialogue tasks. Results demonstrate the efficacy of our model in improving multi-turn conversation understanding, showcasing the potential for advancements in NLU through multi-level dialogue knowledge distillation. Our implementation is open-sourced at https://github.com/adlnlp/Midas.
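The abstract describes distilling knowledge from several task-specific teachers (SI, WS, CD) into one student via a multi-teacher loss. As a rough illustration of that idea — not the paper's actual formulation — the sketch below computes a cross-entropy between the student's softened distribution and a weighted mixture of the teachers' softened distributions; the function names, weighting scheme, and temperature are all assumptions for illustration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def multi_teacher_distill_loss(student_logits, teacher_logits_list,
                               teacher_weights, temperature=2.0):
    """Hypothetical multi-teacher distillation loss (illustrative only):
    cross-entropy between the student's softened prediction and a
    weighted mixture of the teachers' softened predictions."""
    p_student = softmax(student_logits, temperature)
    mixture = sum(w * softmax(t, temperature)
                  for w, t in zip(teacher_weights, teacher_logits_list))
    # Cross-entropy H(mixture, student); small epsilon avoids log(0).
    return float(-(mixture * np.log(p_student + 1e-12)).sum())
```

In a full training loop this term would be combined with the ordinary supervised loss on the gold intent/slot/domain labels, with the teacher weights controlling how much each specialised teacher guides the student.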
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| intent-detection-on-dialogue-state-tracking | MIDAS | Accuracy: 94.27 |
| intent-detection-on-multiwoz-2-2 | MIDAS | Accuracy: 85.02 |
| slot-filling-on-dialogue-state-tracking | MIDAS | F1 score: 98.56 |
| slot-filling-on-multiwoz-2-2 | MIDAS | F1 score: 99.28 |