
J4, 2013, Vol. 35, Issue (8): 96-102.


• Funding:

    National Spark Program of China (2011GA690190); Major Basic Research Program of Natural Science for Jiangsu Provincial Universities (11KJA460001)

Prediction with a BP neural network improved by the Adaboost algorithm

LI Xiang,ZHU Quanyin   

(Faculty of Computer Engineering, Huaiyin Institute of Technology, Huai’an 223003, China)
• Received: 2012-06-18; Revised: 2012-10-22; Online: 2013-08-25; Published: 2013-08-25


Abstract:

The traditional BP (Back Propagation) neural network tends to fall into local minima and suffers from low prediction accuracy. To address this problem, a method combining the Adaboost algorithm with BP neural networks is proposed to improve the prediction accuracy and generalization ability of the network. First, the method preprocesses the sample data and initializes the distribution weights of the test data. Second, it constructs different types of BP neural network weak predictors by selecting different numbers of hidden layer nodes, node transfer functions, training functions, and network learning functions, and trains them repeatedly on the sample data. Finally, the Adaboost algorithm combines the resulting weak predictors into a new strong predictor. Experiments on datasets from the UCI (University of California, Irvine) repository show that, compared with the traditional BP network, this method reduces the mean absolute prediction error by nearly 50% and improves the prediction accuracy of the network, providing a useful reference for neural network prediction.
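The abstract describes the weak-to-strong construction only at a high level. Below is a minimal Python sketch of that idea, assuming an AdaBoost.R2-style scheme (weighted resampling and a weighted-median strong predictor) with scikit-learn's MLPRegressor standing in for the BP weak learner; the function name adaboost_bp, the hidden-layer sizes, and the resampling step are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def adaboost_bp(X, y, hidden_sizes=(4, 8, 16), n_rounds=10, seed=0):
    """Train BP (MLP) weak predictors and combine them AdaBoost.R2-style.

    This is a sketch of the approach summarized in the abstract, not the
    authors' code: weak learners with varying structure are trained on
    reweighted data and combined into a strong predictor.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)            # initial sample distribution weights
    nets, betas = [], []
    for t in range(n_rounds):
        # Vary the network structure across rounds, echoing the paper's use
        # of different hidden node counts / transfer / training functions.
        net = MLPRegressor(
            hidden_layer_sizes=(hidden_sizes[t % len(hidden_sizes)],),
            max_iter=2000, random_state=t)
        # MLPRegressor.fit has no sample_weight argument, so emulate the
        # distribution by weighted resampling of the training set.
        idx = rng.choice(n, size=n, replace=True, p=w)
        net.fit(X[idx], y[idx])
        loss = np.abs(net.predict(X) - y)
        loss = loss / loss.max() if loss.max() > 0 else loss  # normalized loss
        eps = float(np.sum(w * loss))  # weighted error of this round
        if eps >= 0.5:
            break                      # weak learner no better than chance
        beta = max(eps, 1e-12) / (1.0 - eps)
        w = w * beta ** (1.0 - loss)   # shrink weights of well-predicted samples
        w /= w.sum()
        nets.append(net)
        betas.append(beta)
    if not nets:
        raise RuntimeError("no usable weak predictor was produced")
    alphas = np.log(1.0 / np.array(betas))

    def strong(Xq):
        """Strong predictor: weighted median of the weak predictions."""
        P = np.array([net.predict(Xq) for net in nets])  # shape (T, m)
        order = np.argsort(P, axis=0)
        csum = np.cumsum(alphas[order], axis=0)
        med = np.argmax(csum >= 0.5 * alphas.sum(), axis=0)
        cols = np.arange(P.shape[1])
        return P[order[med, cols], cols]
    return strong
```

Usage is then, e.g., strong = adaboost_bp(X_train, y_train) followed by y_hat = strong(X_test). The weighted median is the standard AdaBoost.R2 combination rule for regression; the paper's exact weight-update and combination formulas may differ from this sketch.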

Key words: neural network; strong predictor; iterative algorithm; Adaboost algorithm