• Sponsored journal of the China Computer Federation
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2023, Vol. 45 ›› Issue (05): 895-902.

• Artificial Intelligence and Data Mining •

An entity relation extraction method based on deep learning

Peride Abdurehim1,2, Turdi Tohti1,2, Askar Hamdulla1,2

  1. School of Information Science and Engineering, Xinjiang University, Urumqi 830017, China;
  2. Xinjiang Key Laboratory of Signal Detection and Processing, Urumqi 830017, China
  • Received: 2021-05-06; Revised: 2022-01-08; Accepted: 2023-05-25; Online: 2023-05-25; Published: 2023-05-16

Abstract: Commonly used neural networks such as the Convolutional Neural Network (CNN) and the Recurrent Neural Network (RNN) have achieved very good results in relation extraction tasks. However, CNN is good at capturing local features but is not well suited to processing sequential features, while the traditional RNN can effectively extract features between long-distance words but is prone to gradient vanishing or gradient explosion. To solve these problems, a hybrid neural network model, called BiLSTM-CNN-Attention, is proposed. Combining BiLSTM and CNN allows the two to complement each other, and the introduction of Attention highlights the importance of the inter-entity relation words within the whole sentence. In addition, a mosaic (concatenated) word vector is used in the word embedding layer to overcome the limitation of a single word vector representation. The experimental results show that, compared with the word2vec word vector, the mosaic word vector captures richer semantics and enhances the robustness of the word representation. Compared with the BiLSTM-CNN, CNN-Attention, and BiLSTM-Attention models, BiLSTM-CNN-Attention improves the accuracy and F1 score.
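The abstract describes the overall architecture only at a high level. The following is a minimal PyTorch sketch of how a BiLSTM-CNN-Attention model with a concatenated ("mosaic") embedding layer can be wired together; all layer sizes, the choice of a second embedding table, and the attention formulation are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a BiLSTM-CNN-Attention relation classifier (PyTorch).
# Dimensions and the exact attention/pooling scheme are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMCNNAttention(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, extra_dim=100,
                 hidden_dim=128, num_filters=128, kernel_size=3,
                 num_relations=10):
        super().__init__()
        # "Mosaic" word vector: concatenation of two embedding sources,
        # e.g. a pretrained word2vec table plus a task-specific table.
        self.emb_a = nn.Embedding(vocab_size, emb_dim)
        self.emb_b = nn.Embedding(vocab_size, extra_dim)
        # BiLSTM captures long-distance sequential features.
        self.bilstm = nn.LSTM(emb_dim + extra_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # CNN over the BiLSTM outputs captures local n-gram features.
        self.conv = nn.Conv1d(2 * hidden_dim, num_filters,
                              kernel_size, padding=kernel_size // 2)
        # Attention scores each position's contribution to the relation.
        self.att = nn.Linear(num_filters, 1)
        self.classifier = nn.Linear(num_filters, num_relations)

    def forward(self, token_ids):                       # (batch, seq_len)
        x = torch.cat([self.emb_a(token_ids),
                       self.emb_b(token_ids)], dim=-1)  # mosaic embedding
        h, _ = self.bilstm(x)                           # (batch, seq, 2*hidden)
        c = torch.tanh(self.conv(h.transpose(1, 2)))    # (batch, filters, seq)
        c = c.transpose(1, 2)                           # (batch, seq, filters)
        weights = F.softmax(self.att(c), dim=1)         # attention over positions
        sentence = (weights * c).sum(dim=1)             # attention-pooled vector
        return self.classifier(sentence)                # relation logits

# Example usage with hypothetical sizes:
# model = BiLSTMCNNAttention(vocab_size=20000)
# logits = model(torch.randint(0, 20000, (4, 40)))      # (4, num_relations)
```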

Key words: relation extraction, convolutional neural network, recurrent neural network, attention mechanism, hybrid model, mosaic word vector