• Journal of the China Computer Federation
  • Chinese Core Journal of Science and Technology
  • Chinese Core Journal

Computer Engineering & Science ›› 2025, Vol. 47 ›› Issue (7): 1321-1330.

• Artificial Intelligence and Data Mining •

Multimodal aspect-based sentiment analysis based on dual channel graph convolutional network

ZHANG Feng1, SHAO Yubin1, DU Qingzhi1, LONG Hua1, MA Dinan2

  (1. Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650504, China;
    2. Yunnan Key Laboratory of Media Convergence, Kunming 650228, China)
  • Received: 2023-12-15  Revised: 2024-07-17  Online: 2025-07-25  Published: 2025-08-25

Abstract: In multimodal aspect-based sentiment analysis, traditional methods focus primarily on deep cross-modal interactions between images and text while paying less attention to the aspect-related shallow information within each modality. This oversight introduces aspect-irrelevant noise and limits a model's ability to capture the complex relationships between aspects and sentiments. To address this issue, a dual-channel graph convolutional network (DCGCN) model is proposed. Built on the BART architecture, the approach employs an attention mechanism to enhance aspect semantics, leverages graph convolutional networks (GCN) to extract aspect-enhanced multimodal features, and aggregates syntactic dependencies, aspect-based positional dependencies, and aspect-augmented image-text correlation information into the GCN adjacency weight matrix to obtain multi-information-aware multimodal features. Experiments show that the proposed model achieves F1 scores of 67.4% and 67.9% on two Twitter datasets, respectively, improving the performance of multimodal aspect-based sentiment analysis.
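The abstract's core idea, fusing several relation matrices into one GCN adjacency weight matrix before propagation, can be sketched numerically. The following is a minimal illustration only, not the paper's implementation: the fusion weights (`alpha`, `beta`, `gamma`), the toy syntactic/positional/image-text matrices, and the function names are all assumptions for demonstration.

```python
import numpy as np

def fuse_adjacency(A_syn, A_pos, A_img, alpha=0.5, beta=0.3, gamma=0.2):
    """Aggregate syntactic-dependency, aspect-position, and image-text
    correlation matrices into one adjacency weight matrix.
    The mixing weights here are illustrative, not from the paper."""
    return alpha * A_syn + beta * A_pos + gamma * A_img

def gcn_layer(H, A, W):
    """One graph-convolution step: row-normalize A, propagate features,
    then apply a ReLU nonlinearity."""
    D_inv = 1.0 / np.clip(A.sum(axis=1, keepdims=True), 1e-9, None)
    return np.maximum(0.0, (D_inv * A) @ H @ W)

rng = np.random.default_rng(0)
n, d_in, d_out = 5, 8, 4                      # 5 tokens, toy feature sizes
H = rng.standard_normal((n, d_in))            # multimodal token features
W = rng.standard_normal((d_in, d_out))        # projection weights (random here)

# Toy relation matrices standing in for the three information sources:
A_syn = np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)  # chain-like dependency graph
aspect = 2                                    # index of the aspect token
dist = np.abs(np.arange(n) - aspect)
A_pos = np.tile(np.exp(-dist / n), (n, 1))    # weight decays with distance to aspect
A_img = rng.uniform(0, 1, (n, n))             # stand-in image-text correlations

A = fuse_adjacency(A_syn, A_pos, A_img)
out = gcn_layer(H, A, W)
print(out.shape)                              # (5, 4)
```

In this sketch the normalization keeps node activations on a comparable scale regardless of how many relations contribute to each row; the real model would learn the projection and likely the fusion weights end-to-end.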

Key words: aspect-based sentiment analysis, multimodal, graph convolutional network, syntactic dependency, attention mechanism