• A journal of the China Computer Federation
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2025, Vol. 47 ›› Issue (4): 728-739.

• Artificial Intelligence and Data Mining •

Research on dynamic graph generation model based on deep adversarial network

ZHANG Mengyuan1, DUAN Yang2, WANG Binbin1, ZHANG Lei1, WU Yi1, LIU Chang1, GUO Naiwang1, CHENG Dawei2

  (1. State Grid Shanghai Municipal Electric Power Company, Shanghai 200122, China;
   2. School of Computer Science and Technology, Tongji University, Shanghai 201804, China)
  • Received:2024-07-08 Revised:2024-08-26 Online:2025-04-25 Published:2025-04-17

Abstract: In recent years, the problem of graph generation has received widespread attention. By learning the distribution of real graphs, graph generation techniques can produce synthetic graphs with similar characteristics, and they are widely used in fields such as e-commerce and power networks. In practical applications, most graphs are dynamic, with topological structures that change over time. However, existing graph generators are primarily designed for static graphs and neglect the temporal characteristics of graphs. Additionally, current dynamic graph generation models generally suffer from long training times, making it difficult for them to handle large-scale dynamic graphs. To address these issues, a novel GAN-based model, called dynamic graph generative adversarial network (DGGAN), is proposed. The model's encoder employs a graph self-attention mechanism for parallel computation, thereby improving training efficiency. A gating mechanism controls the information flow, helping the model learn and memorize key information more effectively. Comprehensive experimental evaluations of DGGAN and representative graph generation methods were conducted on six dynamic graph datasets. The experimental results demonstrate that DGGAN outperforms existing models in both the quality of the generated graphs and efficiency.
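The encoder described above combines edge-masked self-attention (which can be computed for all nodes in parallel) with a gating mechanism that blends new information into the node states. The abstract does not give the exact equations, so the following is only a minimal numpy sketch under assumed conventions: `A` is the adjacency matrix of one snapshot (attention is restricted to edges), and the gate `z` follows a GRU-style convex combination. All function and weight names are illustrative, not from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_self_attention(H, A, Wq, Wk, Wv):
    # H: (n, d) node features; A: (n, n) adjacency mask of one snapshot.
    # Scores are computed for all node pairs at once (parallelizable),
    # then masked so each node attends only to its neighbors.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])
    scores = np.where(A > 0, scores, -1e9)  # mask non-edges
    return softmax(scores, axis=-1) @ V

def gated_update(H, M, Wz):
    # Gating mechanism: sigmoid gate z decides how much of the new
    # message M overwrites the previous state H (assumed GRU-style form).
    z = 1.0 / (1.0 + np.exp(-(np.concatenate([H, M], axis=1) @ Wz)))
    return z * M + (1.0 - z) * H

rng = np.random.default_rng(0)
n, d = 5, 8
H = rng.normal(size=(n, d))                    # initial node states
A = (rng.random((n, n)) < 0.4).astype(float)   # random snapshot topology
np.fill_diagonal(A, 1.0)                       # self-loops: attend to self
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Wz = rng.normal(size=(2 * d, d))

M = graph_self_attention(H, A, Wq, Wk, Wv)
H_next = gated_update(H, M, Wz)
print(H_next.shape)  # (5, 8)
```

In a full model, one such layer would be applied per time step (or per attention head) inside the GAN's generator/encoder; the gate lets states for unchanged parts of the graph pass through largely untouched while edited regions are updated.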

Key words: graph generation, dynamic graph, generative adversarial network, graph self-attention mechanism