Computer Engineering & Science ›› 2023, Vol. 45 ›› Issue (08): 1405-1415.

• Computer Network and Information Security •

A semi-supervised log anomaly detection method based on attention mechanism

YIN Chun-yong, FENG Meng-xue

  1. (School of Computer Science, Nanjing University of Information Science and Technology, Nanjing 210044, China)
  • Received: 2023-01-30  Revised: 2023-03-23  Accepted: 2023-08-25  Online: 2023-08-25  Published: 2023-08-18

Abstract: Logs record important information about system operation, and log anomaly detection can quickly and accurately identify the causes of system failures. However, log sequences suffer from data instability and interdependence between data. Therefore, a new semi-supervised log sequence anomaly detection method is proposed. The method uses the Bidirectional Encoder Representations from Transformers (BERT) model and a multi-layer convolutional network to extract log information separately, obtaining the contextual relevance between log sequences and the local relevance within log sequences. An attention-based bidirectional gated recurrent unit (Bi-GRU) network is then used for log sequence anomaly detection. The performance of the method was verified on three datasets. Compared with six benchmark methods, the proposed method achieves the best F1 score and the highest AUC value (0.9813). The experimental results show that it can effectively handle data instability and interdependence between data in log sequences.
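
The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of such a pipeline, assuming pre-computed BERT embeddings for each log event; all module names, layer sizes, and the attention form (single-head additive attention) are illustrative assumptions rather than the paper's exact implementation.

# Illustrative sketch of the described pipeline: BERT embeddings of log
# events -> multi-layer 1D convolution (local relevance) -> bidirectional
# GRU with attention pooling -> binary anomaly score. Hyperparameters are
# assumptions; the paper's exact architecture may differ.
import torch
import torch.nn as nn

class AttentionBiGRUDetector(nn.Module):
    def __init__(self, emb_dim=768, conv_channels=256, hidden=128, conv_layers=2):
        super().__init__()
        # Multi-layer 1D convolutions over the sequence of log-event
        # embeddings capture local correlations between neighboring entries.
        convs, in_ch = [], emb_dim
        for _ in range(conv_layers):
            convs += [nn.Conv1d(in_ch, conv_channels, kernel_size=3, padding=1),
                      nn.ReLU()]
            in_ch = conv_channels
        self.conv = nn.Sequential(*convs)
        # Bidirectional GRU models dependencies in both directions.
        self.bigru = nn.GRU(conv_channels, hidden, batch_first=True,
                            bidirectional=True)
        # Attention pools the GRU states into a single sequence vector.
        self.attn = nn.Linear(2 * hidden, 1)
        self.classifier = nn.Linear(2 * hidden, 1)

    def forward(self, x):
        # x: (batch, seq_len, emb_dim) BERT embeddings of a log sequence
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)             # (B, T, C)
        out, _ = self.bigru(h)                                       # (B, T, 2H)
        weights = torch.softmax(self.attn(out).squeeze(-1), dim=1)   # (B, T)
        context = (out * weights.unsqueeze(-1)).sum(dim=1)           # (B, 2H)
        return torch.sigmoid(self.classifier(context)).squeeze(-1)  # anomaly prob

# Usage on a dummy batch: 4 sequences of 50 log events, 768-d embeddings.
model = AttentionBiGRUDetector()
scores = model(torch.randn(4, 50, 768))
print(scores.shape)  # torch.Size([4])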

Key words: log anomaly detection, bidirectional gated recurrent unit, multilayer convolution, bidirectional encoder representations from transformers, attention mechanism

CLC number: