Multi‐level Deep Correlative Networks for Multi‐modal Sentiment Analysis. Issue 6 (1st November 2020)
- Record Type:
- Journal Article
- Title:
- Multi‐level Deep Correlative Networks for Multi‐modal Sentiment Analysis. Issue 6 (1st November 2020)
- Main Title:
- Multi‐level Deep Correlative Networks for Multi‐modal Sentiment Analysis
- Authors:
- Cai, Guoyong
Lyu, Guangrui
Lin, Yuming
Wen, Yimin
- Abstract:
- Multi‐modal sentiment analysis (MSA) is becoming a research hotspot because it extends conventional text‐based sentiment analysis (SA) to multi‐modal content, which can provide richer affective information. However, compared with text‐based sentiment analysis, multi‐modal sentiment analysis poses far greater challenges, because joint learning on multi‐modal data requires both fine‐grained semantic matching and effective heterogeneous feature fusion. Existing approaches generally infer sentiment by splicing together features extracted from different modalities, but neglect the strong semantic correlation among co‐occurring data of different modalities. To address these challenges, a multi‐level deep correlative network for multi‐modal sentiment analysis is proposed, which reduces the semantic gap by simultaneously analyzing the middle‐level semantic features of images and the hierarchical deep correlations. First, the most relevant cross‐modal feature representation is generated with multi‐modal deep and discriminative correlation analysis (Multi‐DDCA), while keeping the respective modal feature representations discriminative. Second, the high‐level semantic outputs of Multi‐DDCA are encoded into an attention‐correlation cross‐modal feature representation through a co‐attention‐based multi‐modal correlation submodel, and are then further merged by a multi‐layer neural network to train a sentiment classifier that predicts sentiment categories. Extensive experimental results on five datasets demonstrate the effectiveness of the designed approach, which outperforms several state‐of‐the‐art fusion strategies for sentiment analysis.
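The fusion pipeline the abstract outlines (cross-modal affinity, co-attention weighting of each modality, concatenation before a multi-layer classifier) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual Multi-DDCA or co-attention formulation: the bilinear affinity matrix, the max-pooling over the affinity map, and all function names are assumptions made for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention_fuse(text_feats, image_feats, W):
    """Toy co-attention fusion of text and image feature sequences.

    text_feats:  (n_t, d) text token features
    image_feats: (n_i, d) image region features
    W:           (d, d) bilinear affinity matrix (random here; learned in practice)
    """
    # Affinity between every text token and every image region.
    affinity = text_feats @ W @ image_feats.T          # (n_t, n_i)
    # Co-attention: each modality is weighted by its strongest
    # cross-modal affinity (max over the other modality's items).
    text_attn = softmax(affinity.max(axis=1))          # (n_t,)
    image_attn = softmax(affinity.max(axis=0))         # (n_i,)
    # Attention-weighted summary vector per modality.
    text_vec = text_attn @ text_feats                  # (d,)
    image_vec = image_attn @ image_feats               # (d,)
    # Concatenate for a downstream sentiment classifier (MLP).
    return np.concatenate([text_vec, image_vec])       # (2d,)

rng = np.random.default_rng(0)
d = 8
fused = co_attention_fuse(rng.normal(size=(5, d)),   # 5 text tokens
                          rng.normal(size=(3, d)),   # 3 image regions
                          rng.normal(size=(d, d)))
print(fused.shape)  # (16,)
```

The fused vector would then feed a small multi-layer network trained for sentiment classification, as in the final stage the abstract describes.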
- Is Part Of:
- Chinese journal of electronics. Volume 29:Issue 6(2020)
- Journal:
- Chinese journal of electronics
- Issue:
- Volume 29:Issue 6(2020)
- Issue Display:
- Volume 29, Issue 6 (2020)
- Year:
- 2020
- Volume:
- 29
- Issue:
- 6
- Issue Sort Value:
- 2020-0029-0006-0000
- Page Start:
- 1025
- Page End:
- 1038
- Publication Date:
- 2020-11-01
- Subjects:
- feature extraction -- image fusion -- image representation -- learning (artificial intelligence) -- pattern classification -- text analysis
attention‐correlation cross‐modal feature representation -- multilayer neural network -- Multilevel Deep correlative networks -- conventional Sentiment analysis -- multimodal content -- text‐based sentiment analysis -- respective modal feature representations -- discriminative correlation analysis -- Multimodal Deep -- relevant cross‐modal feature representation -- multimodal sentiment analysis -- multilevel deep correlative network -- multimodal data
Multi‐modal sentiment analysis -- Multilevel deep correlation network -- Discriminant correlation analysis -- Co‐attention
Electronics -- Periodicals
Electronics -- China -- Periodicals
Electronics
China
Periodicals
621.38105
- Journal URLs:
- https://ietresearch.onlinelibrary.wiley.com/journal/20755597
http://ieeexplore.ieee.org/servlet/opac?punumber=7479413
http://ieeexplore.ieee.org/Xplore/home.jsp
- DOI:
- 10.1049/cje.2020.09.003
- Languages:
- English
- ISSNs:
- 1022-4653
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 3180.317180
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 16449.xml