Computer Engineering & Science ›› 2024, Vol. 46 ›› Issue (01): 179-190.
• Artificial Intelligence and Data Mining •
ZHAO Wen-hui1,WU Xiao-ling1,LING Jie1,HOON Heo2
Abstract: Sentiment texts are expressed differently across domains, so a separate sentiment analysis model usually has to be trained for each domain. To address the problem that a single model cannot be used for multi-domain sentiment analysis, this paper proposes a multi-domain text sentiment analysis method based on prompt tuning, called MSAPT. Hard prompts that indicate the domain of the sentiment text and the candidate sentiment labels guide the model to draw on its knowledge of sentiment analysis in different domains, and a unified "generalist" model is pretrained for sentiment analysis. During downstream learning on texts from various domains, the model is frozen and only soft prompts are tuned, so that the model learns the characteristics of sentiment text in each downstream domain. For multi-domain sentiment analysis, MSAPT therefore only requires storing a single model together with a set of prompts whose parameter count is far smaller than the model's. Experiments on multiple sentiment-text datasets from different domains show that MSAPT outperforms model fine-tuning even though only prompt tuning is applied. Finally, ablation studies on the soft prompt length, domain-specific hard prompts, soft prompts, and the size of the intermediate training dataset demonstrate their respective impacts on sentiment analysis performance.
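To make the mechanism concrete, the following is a minimal sketch (not the paper's actual code) of how an MSAPT-style hard prompt could be rendered, and of why storing per-domain soft prompts is cheap compared with keeping a fine-tuned model copy per domain. The template wording, the hyperparameters (`prompt_length`, `embed_dim`), and the backbone size are illustrative assumptions, not values from the paper.

```python
def build_hard_prompt(domain: str, labels: list[str], text: str) -> str:
    """Prepend the domain name and the candidate sentiment labels to the input
    text, as the abstract describes for hard prompts (template is an assumption)."""
    return f"Domain: {domain}. Sentiment labels: {', '.join(labels)}. Text: {text}"

prompt = build_hard_prompt(
    "hotel reviews",
    ["positive", "negative"],
    "The room was clean and the staff were friendly.",
)
print(prompt)

# Storage comparison: one frozen backbone plus a small soft prompt per domain,
# versus one fully fine-tuned backbone copy per domain.
backbone_params = 220_000_000          # roughly T5-base scale (assumption)
prompt_length, embed_dim = 100, 768    # illustrative soft-prompt shape
soft_prompt_params = prompt_length * embed_dim  # trainable vectors per domain

n_domains = 10
msapt_total = backbone_params + n_domains * soft_prompt_params
finetune_total = n_domains * backbone_params
print(f"MSAPT stores {msapt_total:,} parameters for {n_domains} domains")
print(f"Per-domain fine-tuning stores {finetune_total:,} parameters")
```

The arithmetic illustrates the abstract's storage claim: the per-domain cost of a soft prompt is orders of magnitude below a full model copy.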
Key words: multi-domain sentiment analysis, prompt tuning, pre-trained language model (PLM), T5
ZHAO Wen-hui, WU Xiao-ling, LING Jie, HOON Heo. Multi-domain sentiment analysis of Chinese text based on prompt tuning[J]. Computer Engineering & Science, 2024, 46(01): 179-190.
URL: http://joces.nudt.edu.cn/EN/Y2024/V46/I01/179