• A journal of the China Computer Federation (CCF)
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2022, Vol. 44 ›› Issue (03): 454-462.

• Computer Networks and Information Security •

An attributed network node embedding method combining a two-level attention mechanism

YANG Fan-yi1, MA Hui-fang1,2, YAN Cai-rui1, SU Yun1

  1. (1. College of Computer Science and Engineering, Northwest Normal University, Lanzhou, Gansu 730070, China;
    2. Guangxi Key Laboratory of Trusted Software, Guilin University of Electronic Technology, Guilin, Guangxi 541004, China)

  • Received: 2020-09-25  Revised: 2020-11-18  Accepted: 2022-03-25  Online: 2022-03-25  Published: 2022-03-24
  • Supported by:
    National Natural Science Foundation of China (61762078, 61363058, 61802404); Natural Science Foundation of Gansu Province (21JR7RA114); Young Teachers' Capability Enhancement Program of Northwest Normal University (NWNU-LKQN2019-2); Research Fund of Guangxi Key Laboratory of Trusted Software (kx202003)


Abstract: Attributed network embedding aims to learn low-dimensional representations of the nodes in a network, such that nodes with similar topology and attributes lie close to each other in the embedding space. An attention mechanism can effectively learn the relative importance of a node's neighbors and aggregate node representations according to that importance. On this basis, a node embedding algorithm for attributed networks that fuses a two-level attention mechanism, named NETA, is proposed to realize attributed network embedding effectively. The algorithm captures direct neighbors from the topological structure and indirect neighbors from attribute relationships, taking the relative importance of neighbors into account throughout. Specifically, the direct and indirect neighbors of each node are first captured; node-level attention is then designed to aggregate the direct-neighbor and indirect-neighbor representations separately; finally, semantic-level attention is designed to fuse the two resulting representations into the final embedding. Extensive experiments on synthetic and real-world datasets verify the effectiveness of the proposed algorithm.
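The two-level scheme described above can be sketched in a few lines. The following is a minimal illustrative sketch only, not the paper's exact formulation: the dot-product scoring at the node level and the mean-based scoring at the semantic level are hypothetical stand-ins for whatever attention functions NETA actually uses.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def node_level_attention(target, neighbors):
    """Aggregate neighbor embeddings into one view, weighting each
    neighbor by an (assumed) dot-product score against the target node."""
    scores = neighbors @ target        # one relevance score per neighbor
    weights = softmax(scores)          # relative importance of neighbors
    return weights @ neighbors         # weighted sum -> aggregated view

def semantic_level_attention(views):
    """Fuse the per-view embeddings (direct-neighbor view and
    indirect-neighbor view) with attention over the views themselves."""
    views = np.stack(views)            # (n_views, dim)
    scores = views.mean(axis=1)        # toy per-view importance score
    weights = softmax(scores)
    return weights @ views             # final fused embedding

# Toy example: one target node with two neighbor sets, mirroring the
# direct (topological) and indirect (attribute-based) neighbors.
rng = np.random.default_rng(0)
target = rng.normal(size=8)
direct = rng.normal(size=(3, 8))      # direct neighbors' embeddings
indirect = rng.normal(size=(5, 8))    # indirect neighbors' embeddings

z_direct = node_level_attention(target, direct)
z_indirect = node_level_attention(target, indirect)
z_final = semantic_level_attention([z_direct, z_indirect])
print(z_final.shape)  # (8,)
```

The key structural point the sketch shows is the separation of concerns: node-level attention runs once per neighbor set, and semantic-level attention only ever sees the two aggregated views, not individual neighbors.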

Key words: node-level attention, semantic-level attention, attributed network, node embedding


