• Journal of the China Computer Federation (CCF)
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2024, Vol. 46 ›› Issue (01): 179-190.

• Artificial Intelligence and Data Mining •

Multi-domain sentiment analysis of Chinese text based on prompt tuning

ZHAO Wen-hui1,WU Xiao-ling1,LING Jie1,HOON Heo2   

  1. School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China;
  2. Samsung Electro-Mechanics, Suwon 16674, Korea
  • Received: 2022-06-20; Revised: 2022-12-14; Accepted: 2024-01-25; Online: 2024-01-25; Published: 2024-01-15

Abstract: Sentiment is expressed differently in texts from different domains, so a separate sentiment analysis model usually has to be trained for each domain. To address the problem that a single model cannot serve multi-domain sentiment analysis, this paper proposes a multi-domain text sentiment analysis method based on prompt tuning, called MSAPT. Hard prompts indicating the domain of the sentiment text and the candidate sentiment labels prompt the model to draw on its knowledge of sentiment analysis across domains, and a unified "generalized model" is then pretrained for sentiment analysis. In downstream learning on texts from various domains, the model is frozen and prompt tuning is used to make it learn the characteristics of sentiment text in each downstream domain. For multi-domain sentiment analysis, MSAPT only requires saving a single model and a few prompts whose parameter counts are far smaller than the model's. Experiments on multiple sentiment-text datasets from different domains show that MSAPT outperforms model fine-tuning even when only prompt tuning is applied. Finally, ablation studies on the prompt length, domain-specific hard prompts, soft prompts, and the size of the intermediate training dataset demonstrate their respective impact on sentiment analysis performance.
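The two ingredients described in the abstract can be illustrated with a toy sketch (hypothetical, not the paper's MSAPT code): a hard prompt is plain text naming the domain and the candidate labels, while soft-prompt tuning updates only a small prompt vector by gradient descent and leaves the pretrained model frozen. Here a random linear map stands in for the frozen pretrained language model, and the prompt is injected additively for simplicity.

```python
import numpy as np

# Hard prompt (illustrative template, assumed format): plain text naming the
# domain and the candidate sentiment labels, prepended to the input text.
domain, labels, review = "hotel", ["positive", "negative"], "The room was spotless."
hard_prompt = f"Domain: {domain}. Labels: {', '.join(labels)}. Text: {review}"

# Soft-prompt tuning reduced to a toy: the "model" W is frozen and only the
# prompt vector p receives gradient updates.
rng = np.random.default_rng(0)
d = 8
W = rng.normal(size=d)   # frozen stand-in for a pretrained LM
x = rng.normal(size=d)   # fixed input embedding
p = np.zeros(d)          # trainable soft prompt
target, lr = 1.0, 0.01   # regression target and learning rate

for _ in range(500):
    y = W @ (x + p)                  # forward pass through the frozen model
    p -= lr * 2 * (y - target) * W   # gradient step on the prompt only; W untouched

print(round(float(W @ (x + p)), 3))  # output converges toward the target
```

Only `p` (d parameters) needs to be stored per domain, which mirrors the paper's point that saved prompts are far smaller than the shared model.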

Key words: multi-domain sentiment analysis, prompt tuning, pre-trained language model (PLM), T5