Gated Mechanism for Attention Based Multimodal Sentiment Analysis

Ayush Kumar, Jithendra Vepa

Abstract

Multimodal sentiment analysis has recently gained popularity because of its relevance to social media posts, customer service calls and video blogs. In this paper, we address three aspects of multimodal sentiment analysis: (1) cross-modal interaction learning, i.e., how multiple modalities contribute to the sentiment; (2) learning long-term dependencies in multimodal interactions; and (3) fusion of unimodal and cross-modal cues. Of these three, we find that learning cross-modal interactions is beneficial for this problem. We perform experiments on two benchmark datasets, the CMU Multimodal Opinion-level Sentiment Intensity (CMU-MOSI) and CMU Multimodal Opinion Sentiment and Emotion Intensity (CMU-MOSEI) corpora. Our approach yields accuracies of 83.9% and 81.1% on these two tasks respectively, which is a 1.6% and 1.34% absolute improvement over the current state-of-the-art.
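The fusion of unimodal and cross-modal cues described above can be illustrated with a minimal gated-fusion sketch. This is an assumption-laden toy, not the paper's actual architecture: the vector names (`unimodal`, `crossmodal`), the dimensionality, and the single-layer sigmoid gate are all illustrative choices; in the paper the gate parameters would be learned end-to-end alongside the attention layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d = 8  # illustrative feature dimension

# Hypothetical feature vectors for one utterance.
unimodal = rng.standard_normal(d)    # e.g. a text-only representation
crossmodal = rng.standard_normal(d)  # e.g. the output of text-audio attention

# Gate parameters; randomly initialised here, learned in practice.
W = rng.standard_normal((d, 2 * d))
b = np.zeros(d)

# The gate looks at both representations and decides, per dimension,
# how much cross-modal evidence to let through.
gate = sigmoid(W @ np.concatenate([unimodal, crossmodal]) + b)

# Convex combination of unimodal and cross-modal cues.
fused = gate * crossmodal + (1.0 - gate) * unimodal
```

Because the gate output lies in (0, 1) per dimension, the fused vector always stays on the segment between the two input representations, which lets the model fall back to unimodal cues when the cross-modal signal is uninformative.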

Benchmarks

Benchmark: multimodal-sentiment-analysis-on-cmu-mosei-1
Methodology: Proposed: B2 + B4 w/ multimodal fusion
Accuracy: 81.14%

Benchmark: multimodal-sentiment-analysis-on-mosi
Methodology: Proposed: B2 + B4 w/ multimodal fusion
Accuracy: 83.91%
F1 score: 81.17
