[1] Kalchbrenner N, Grefenstette E, Blunsom P. A convolutional neural network for modelling sentences[C]∥Proc of the 52nd Annual Meeting of the Association for Computational Linguistics, 2014: 655-665.
[2] Monika R, Deivalakshmi S, Janet B. Sentiment analysis of US airlines tweets using LSTM/RNN[C]∥Proc of 2019 IEEE 9th International Conference on Advanced Computing, 2019: 92-95.
[3] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[C]∥Proc of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019: 4171-4186.
[4] Liu Y, Ott M, Goyal N, et al. RoBERTa: A robustly optimized BERT pretraining approach[J]. arXiv: 1907.11692, 2019.
[5] Raffel C, Shazeer N, Roberts A, et al. Exploring the limits of transfer learning with a unified text-to-text transformer[J]. Journal of Machine Learning Research, 2020, 21(140): 1-67.
[6] Howard J, Ruder S. Universal language model fine-tuning for text classification[C]∥Proc of the 56th Annual Meeting of the Association for Computational Linguistics, 2018: 328-339.
[7] 黄泽民, 吴晓鸰, 吴迎岗, 等. 结合BERT和BiSRU-AT的中文文本情感分类[J]. 计算机工程与科学, 2021, 43(9): 1668-1674.
Huang Ze-min, Wu Xiao-ling, Wu Ying-gang, et al. Analysis of Chinese text emotions combining BERT and BiSRU-AT[J]. Computer Engineering & Science, 2021, 43(9): 1668-1674.
[8] 王春东, 张卉, 莫秀良, 等. 微博情感分析综述[J]. 计算机工程与科学, 2022, 44(1): 165-175.
Wang Chun-dong, Zhang Hui, Mo Xiu-liang, et al. Overview on sentiment analysis of microblog[J]. Computer Engineering & Science, 2022, 44(1): 165-175.
[9] Man R, Lin K. Sentiment analysis algorithm based on BERT and convolutional neural network[C]∥Proc of 2021 IEEE Asia-Pacific Conference on Image Processing, Electronics and Computers, 2021: 769-772.
[10] 梅侠峰, 吴晓鸰, 吴杰文, 等. 结合ALBERT和BiFASRU-AT的情感分析模型[J]. 小型微型计算机系统, 2023, 44(1): 36-42.
Mei Xia-feng, Wu Xiao-ling, Wu Jie-wen, et al. Sentiment analysis model combined with ALBERT and BiFASRU-AT[J]. Journal of Chinese Computer Systems, 2023, 44(1): 36-42.
[11] 赵传君, 王素格, 李德玉. 跨领域文本情感分类研究进展[J]. 软件学报, 2020, 31(6): 1723-1746.
Zhao Chuan-jun, Wang Su-ge, Li De-yu. Research progress on cross-domain text sentiment classification[J]. Journal of Software, 2020, 31(6): 1723-1746.
[12] 杨修远, 彭韬, 杨亮, 等. 基于知识蒸馏的自适应多领域情感分析[J]. 山东大学学报(工学版), 2021, 51(3): 15-21.
Yang Xiu-yuan, Peng Tao, Yang Liang, et al. Adaptive multi-domain sentiment analysis based on knowledge distillation[J]. Journal of Shandong University (Engineering Science), 2021, 51(3): 15-21.
[13] Brown T, Mann B, Ryder N, et al. Language models are few-shot learners[C]∥Proc of the 34th International Conference on Neural Information Processing Systems, 2020: 1877-1901.
[14] Shin T, Razeghi Y, Logan IV R L, et al. AutoPrompt: Eliciting knowledge from language models with automatically generated prompts[C]∥Proc of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020: 4222-4235.
[15] Li X L, Liang P. Prefix-tuning: Optimizing continuous prompts for generation[C]∥Proc of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021: 4582-4597.
[16] Hambardzumyan K, Khachatrian H, May J. WARP: Word-level adversarial reprogramming[C]∥Proc of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021: 4921-4933.
[17] Lester B, Al-Rfou R, Constant N. The power of scale for parameter-efficient prompt tuning[J]. arXiv: 2104.08691, 2021.
[18] Gu Y, Han X, Liu Z, et al. PPT: Pre-trained prompt tuning for few-shot learning[J]. arXiv: 2109.04332, 2021.
[19] Vu T, Lester B, Constant N, et al. SPoT: Better frozen model adaptation through soft prompt transfer[J]. arXiv: 2110.07904, 2021.
[20] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[C]∥Proc of the 31st International Conference on Neural Information Processing Systems, 2017: 6000-6010.
[21] Jiang Z, Xu F F, Araki J, et al. How can we know what language models know?[J]. Transactions of the Association for Computational Linguistics, 2020, 8: 423-438.
[22] Zhong R, Lee K, Zhang Z, et al. Adapting language models for zero-shot learning by meta-tuning on dataset and prompt collections[J]. arXiv: 2104.04670, 2021.
[23] Zhang Z, Zhang H, Chen K, et al. Mengzi: Towards lightweight yet ingenious pre-trained models for Chinese[J]. arXiv: 2110.06696, 2021.
[24] Tian H, Gao C, Xiao X, et al. SKEP: Sentiment knowledge enhanced pre-training for sentiment analysis[C]∥Proc of the 58th Annual Meeting of the Association for Computational Linguistics, 2020: 4067-4076.
[25] Xu L, Hu H, Zhang X, et al. CLUE: A Chinese language understanding evaluation benchmark[C]∥Proc of the 28th International Conference on Computational Linguistics, 2020: 4762-4772.
[26] Cui Y M, Che W X, Liu T, et al. Pre-training with whole word masking for Chinese BERT[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021, 29: 3504-3514.
[27] Loshchilov I, Hutter F. Decoupled weight decay regularization[C]∥Proc of the 2019 International Conference on Learning Representations, 2019: 935-942.
[28] 曲昭伟, 王源, 王晓茹. 基于迁移学习的分层注意力网络情感分析算法[J]. 计算机应用, 2018, 38(11): 3053-3062.
Qu Zhao-wei, Wang Yuan, Wang Xiao-ru. Transfer learning based hierarchical attention neural network for sentiment analysis[J]. Journal of Computer Applications, 2018, 38(11): 3053-3062.
[29] 余传明. 基于深度循环神经网络的跨领域文本情感分析[J]. 图书情报工作, 2018, 62(11): 23-34.
Yu Chuan-ming. A cross-domain text sentiment analysis based on deep recurrent neural network[J]. Library and Information Service, 2018, 62(11): 23-34.