
Computer Engineering & Science, 2020, Vol. 42, Issue 11: 2059-2066.


Entity relationship extraction fusing self-attention mechanism and CNN

YAN Xiong, DUAN Yuexing, ZHANG Zehua

  1. (College of Information and Computer, Taiyuan University of Technology, Jinzhong 030600, China)
  • Received: 2019-09-26  Revised: 2020-03-19  Accepted: 2020-11-25  Online: 2020-11-25  Published: 2020-11-30

Abstract: At present, neural network models play an important role in entity relationship extraction tasks. A convolutional neural network can extract features automatically, but it is limited because its convolution kernels use a fixed window size and therefore capture only local contextual semantic information about the words in a sentence. This paper proposes a new relation extraction method that fuses a self-attention mechanism with a convolutional neural network. The self-attention mechanism is applied to the original word vectors to capture the dependencies between the words in the sequence, so the input word vectors express richer semantic information and compensate for the limitations of the convolutional neural network's automatic feature extraction. Experimental results on the SemEval-2010 Task 8 dataset show that adding the self-attention mechanism improves entity relationship extraction performance.
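
The pipeline described in the abstract, self-attention applied to the original word vectors to capture inter-word dependencies, followed by fixed-window convolutions for feature extraction and relation classification, can be illustrated in code. The sketch below is a minimal, hypothetical reading of that idea in PyTorch, not the authors' published model; the embedding size, filter counts, window sizes, and the 19 relation classes of SemEval-2010 Task 8 are illustrative assumptions.

```python
# Minimal sketch (not the authors' exact model): scaled dot-product self-attention
# enriches the word vectors before fixed-window 1-D convolutions extract features
# for relation classification. Dimensions and hyperparameters are placeholders.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentionCNN(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, num_filters=128,
                 window_sizes=(3, 4, 5), num_classes=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Linear projections for query, key, value used by self-attention.
        self.q_proj = nn.Linear(emb_dim, emb_dim)
        self.k_proj = nn.Linear(emb_dim, emb_dim)
        self.v_proj = nn.Linear(emb_dim, emb_dim)
        # One 1-D convolution per window size over the attention-enriched vectors.
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, num_filters, kernel_size=w, padding=w // 2)
            for w in window_sizes
        )
        self.classifier = nn.Linear(num_filters * len(window_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embed(token_ids)                        # (batch, seq_len, emb_dim)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Scaled dot-product self-attention: every word attends to every other word,
        # so the enriched vectors carry sentence-level dependencies.
        scores = torch.bmm(q, k.transpose(1, 2)) / math.sqrt(q.size(-1))
        attn = F.softmax(scores, dim=-1)
        x = torch.bmm(attn, v) + x                       # residual keeps the original vectors
        # CNN feature extraction with fixed-size windows, then max pooling over time.
        x = x.transpose(1, 2)                            # (batch, emb_dim, seq_len)
        pooled = [conv(x).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)
        return self.classifier(features)

if __name__ == "__main__":
    model = SelfAttentionCNN()
    dummy = torch.randint(1, 10000, (2, 40))             # two sentences of 40 tokens
    print(model(dummy).shape)                             # torch.Size([2, 19])
```

The residual connection and the multiple window sizes are design choices of this sketch; the point it illustrates is simply that the attention step lets each word's vector absorb information from the whole sentence before the fixed-window convolutions see it.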


Key words: entity relationship extraction, self-attention mechanism, convolutional neural network, word vector, contextual semantics