
CTRAN: CNN-Transformer-based Network for Natural Language Understanding

Mehrdad Rafiepour, Javad Salimi Sartakhti


Abstract

Intent-detection and slot-filling are the two main tasks in natural language understanding. In this study, we propose CTRAN, a novel encoder-decoder CNN-Transformer-based architecture for intent-detection and slot-filling. In the encoder, we use BERT followed by several convolutional layers, and rearrange the output using a window feature sequence. We use stacked Transformer encoders after the window feature sequence. For the intent-detection decoder, we utilize self-attention followed by a linear layer. In the slot-filling decoder, we introduce the aligned Transformer decoder, which utilizes a zero diagonal mask to align output tags with input tokens. We apply our network to ATIS and SNIPS, and surpass the current state-of-the-art in slot-filling on both datasets. Furthermore, we incorporate the language model as word embeddings, and show that this strategy yields better results than using the language model as an encoder.
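The distinctive piece on the slot-filling side is the zero diagonal mask in the aligned Transformer decoder, which keeps the tag sequence position-aligned with the input tokens rather than decoding autoregressively. Below is a minimal PyTorch sketch of how such a mask could be plugged into a standard `nn.TransformerDecoder`; the dimensions, layer counts, and the `zero_diagonal_mask` helper are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

def zero_diagonal_mask(seq_len: int) -> torch.Tensor:
    """Boolean self-attention mask whose diagonal is masked out.

    In nn.Transformer* modules, True means "do not attend", so each
    position can attend to every token except itself.
    """
    return torch.eye(seq_len, dtype=torch.bool)

# Hypothetical dimensions, for illustration only.
d_model, n_heads, seq_len, batch = 256, 8, 12, 2

decoder_layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
decoder = nn.TransformerDecoder(decoder_layer, num_layers=2)

tgt = torch.randn(batch, seq_len, d_model)     # stand-in for tag-side inputs
memory = torch.randn(batch, seq_len, d_model)  # stand-in for encoder output

# One output vector per input token, ready for a per-token slot classifier.
out = decoder(tgt, memory, tgt_mask=zero_diagonal_mask(seq_len))
print(out.shape)  # torch.Size([2, 12, 256])
```

Because only the diagonal is masked, position i gathers context from all other positions through self-attention while its own token's information arrives through cross-attention to the encoder memory, which is one way to realize the tag-to-token alignment the abstract describes.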

Code Repositories

rafiepour/CTran (official, PyTorch)

Benchmarks

Benchmark                 | Methodology | Metric
intent-detection-on-atis  | CTRAN       | Accuracy: 98.07
intent-detection-on-snips | CTRAN       | Accuracy: 99.42
slot-filling-on-atis      | CTRAN       | F1: 98.46
slot-filling-on-snips     | CTRAN       | F1: 98.30
