• A publication of the China Computer Federation
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science



Multi-scale information fusion and layered attention aggregation for subgraph federated learning

WANG Ruoyu, DING Shifei, GUO Lili

  (1. School of Computer Science and Technology, China University of Mining and Technology, Xuzhou 221116, China;
   2. Mine Digitization Engineering Research Center, Ministry of Education (China University of Mining and Technology), Xuzhou 221116, China)


Abstract: Subgraph federated learning processes subgraphs of a global graph locally with graph convolutional networks and updates the model parameters via a server, thereby protecting user privacy. Existing approaches pay no special attention to nodes that matter more for certain tasks or graph structures, which may reduce the quality of the node embeddings. This research proposes FedMFG, a subgraph federated learning framework that introduces a multi-scale information fusion convolution to integrate node features with neighbor information, thereby enhancing node feature representations. The framework exchanges parameters during pre-training rounds to reduce communication cost, and applies an attention mechanism on the server side to dynamically adjust client weights for better aggregation of the global parameters. Experimental results on standard benchmark datasets show that FedMFG achieves higher accuracy, better stability, and lower communication cost than previous methods.
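The two mechanisms the abstract names can be illustrated in miniature. The sketch below, which is not the paper's actual implementation, shows (a) a multi-scale fusion step that concatenates node features with their 1-hop and 2-hop neighborhood aggregations, and (b) a server-side aggregation that weights each client's parameter vector by softmax similarity to the current global parameters; all function names, the similarity measure, and the toy data are illustrative assumptions.

```python
import numpy as np

def multi_scale_fusion(adj, feats, num_scales=2):
    """Concatenate raw node features with row-normalized neighbor
    aggregations at 1..num_scales hops (toy stand-in for a
    multi-scale information fusion convolution)."""
    deg = adj.sum(axis=1, keepdims=True)
    norm_adj = adj / np.maximum(deg, 1.0)        # row-normalized adjacency
    scales, h = [feats], feats
    for _ in range(num_scales):
        h = norm_adj @ h                         # one more hop of smoothing
        scales.append(h)
    return np.concatenate(scales, axis=1)        # fuse all scales per node

def attention_aggregate(client_params, global_params):
    """Server side: weight each client's parameter vector by the softmax
    of its dot-product similarity to the current global parameters,
    then return the attention-weighted average."""
    sims = np.array([p @ global_params for p in client_params])
    w = np.exp(sims - sims.max())                # numerically stable softmax
    w = w / w.sum()
    return sum(wi * p for wi, p in zip(w, client_params))

# Toy example: a 4-node ring graph with one-hot features, two clients.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
feats = np.eye(4)
fused = multi_scale_fusion(adj, feats)           # shape (4, 12): 3 scales x 4 dims

clients = [np.ones(3), np.array([1.0, 2.0, 3.0])]
global_p = np.array([0.5, 0.5, 0.5])
new_global = attention_aggregate(clients, global_p)
```

The attention weights are a convex combination, so the aggregated parameters always stay within the range spanned by the clients' values; clients whose updates align better with the current global model receive larger weight.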

Key words: subgraph federated learning, multi-scale information fusion convolution, graph convolutional network, attention mechanism