
Chaotic time series prediction based on robust extreme learning machine

Shen Li-Hua, Chen Ji-Hong, Zeng Zhi-Gang, Jin Jian
  • To address the problem that chaotic time series prediction models are easily affected by outliers, which lowers their prediction accuracy, a robust extreme learning machine is proposed in a Bayesian framework. The proposed model takes a Gaussian mixture distribution, which has heavy-tailed properties, as the likelihood function of the model output, yielding a prediction model that is more robust to outliers and noise. However, with the Gaussian mixture likelihood the marginal likelihood of the model output becomes analytically intractable, so a variational method is introduced for approximate inference and estimation of the model parameters. With outliers and noise added, the proposed model is applied to the prediction of the Lorenz series generated by an atmospheric-circulation model, the Rossler chaotic time series, and the sunspot chaotic time series; the prediction results verify the effectiveness of the proposed model.
    Chaos is a seemingly irregular, random-like motion that occurs in deterministic natural systems, and time series with chaotic characteristics are obtained from more and more real systems, such as atmospheric circulation, temperature, rainfall, sunspots, and the Yellow River flow. Chaotic time series prediction has therefore become a research hotspot in recent years. Because neural networks have strong nonlinear approximation capability, they perform well in chaotic time series modeling. The extreme learning machine (ELM) is a kind of neural network that is widely used owing to its simple structure, high learning efficiency, and globally optimal solution. An ELM initializes the input weights randomly and adjusts only the output weights during training; it can thus obtain the global optimum, converges quickly, and avoids the vanishing-gradient problem. Owing to these advantages, improved ELM algorithms have developed rapidly in recent years. However, the traditional training methods of the extreme learning machine have poor robustness and are easily affected by noise and outliers, and in practical applications time series are often contaminated by both; improving the robustness of the forecasting model to reduce the influence of noise and outliers is therefore important for prediction accuracy. In this paper, a robust extreme learning machine is proposed in a Bayesian framework to handle outliers in the training data set. First, the input samples are mapped onto a high-dimensional space, and the output weights of the extreme learning machine are treated as the parameters to be estimated; the proposed model then uses the more robust Gaussian mixture distribution as the likelihood function of the model output. Because the marginal likelihood of the model output is analytically intractable for the Gaussian mixture distribution, a variational procedure is introduced to realize the parameter estimation. Under different noise levels and different numbers of outliers, the proposed model is compared with other prediction models. Experimental results on the Lorenz, Rossler, and Sunspot-Runoff in the Yellow River time series with outliers and noise demonstrate that the proposed robust extreme learning machine obtains better prediction accuracy. The proposed model not only has strong nonlinear approximation capability but also learns its parameters automatically and is highly robust. Finally, the time complexities of the different models are compared, and the convergence of the proposed model is analyzed at the end of the paper.
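As background for the model described above, the basic extreme learning machine can be sketched in a few lines: the input weights of a single hidden layer are drawn at random and fixed, and only the output weights are fitted by (ridge-regularized) least squares. The sketch below applies this to one-step-ahead prediction of the logistic map, a standard chaotic series; the embedding dimension, number of hidden units, and regularization constant are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generate a chaotic series from the logistic map x_{t+1} = 4 x_t (1 - x_t).
x = np.empty(500)
x[0] = 0.2
for t in range(499):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# Phase-space embedding: predict x[t+d] from the d previous values.
d = 3
X = np.array([x[t:t + d] for t in range(len(x) - d)])
y = x[d:]
X_train, y_train = X[:400], y[:400]
X_test, y_test = X[400:], y[400:]

# ELM: random, fixed input weights; only the output weights are trained.
n_hidden = 50
W_in = rng.normal(size=(d, n_hidden))
bias = rng.normal(size=n_hidden)

def hidden(X):
    """Random nonlinear feature map of the ELM."""
    return np.tanh(X @ W_in + bias)

# Ridge-regularized least squares for the output weights: one linear solve.
H = hidden(X_train)
lam = 1e-6
w_out = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y_train)

pred = hidden(X_test) @ w_out
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
```

Because the hidden layer is never trained, fitting reduces to a single linear solve, which is the source of the training-speed advantage the abstract mentions; the same least-squares step is also what makes the plain ELM sensitive to outliers in the targets.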
      Corresponding author: Jin Jian, D201477195@hust.edu.cn
    • Funds: Project supported by the National Natural Science Foundation of China (Grant No. 51575210) and the National Science and Technology Major Project of the Ministry of Science and Technology of China (Grant No. 2014ZX04001051).
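The paper's robust model replaces the usual Gaussian output likelihood with a heavy-tailed Gaussian mixture and fits it by variational inference. As a simplified, hypothetical stand-in for that scheme (an EM loop with a two-component mixture noise model rather than the authors' variational Bayesian treatment), the sketch below downweights samples assigned to the broad "outlier" component when re-solving for the output weights; all names, sizes, and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression task with random ELM features and injected outliers.
n, d, n_hidden = 200, 3, 40
X = rng.uniform(-1.0, 1.0, size=(n, d))
W_in = rng.normal(size=(d, n_hidden))
bias = rng.normal(size=n_hidden)
H = np.tanh(X @ W_in + bias)

y_clean = np.sin(X.sum(axis=1))
y = y_clean + 0.05 * rng.normal(size=n)
out_idx = rng.choice(n, size=10, replace=False)
y[out_idx] += rng.choice([-5.0, 5.0], size=10)   # gross outliers

lam = 1e-4
I = np.eye(n_hidden)

# Plain (non-robust) ridge solution, corrupted by the outliers.
w_plain = np.linalg.solve(H.T @ H + lam * I, H.T @ y)

def normal_pdf(e, var):
    return np.exp(-e ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

# EM for a two-component Gaussian mixture noise model:
# a narrow "inlier" component (variance s1) and a broad "outlier" one (s2).
w, s1, s2, pi = w_plain.copy(), 0.1, 10.0, 0.9
for _ in range(30):
    e = y - H @ w
    p1 = pi * normal_pdf(e, s1)
    p2 = (1.0 - pi) * normal_pdf(e, s2)
    r = p1 / (p1 + p2 + 1e-300)          # responsibility of inlier component
    prec = r / s1 + (1.0 - r) / s2       # per-sample effective precision
    Hw = H * prec[:, None]
    w = np.linalg.solve(H.T @ Hw + lam * I, H.T @ (prec * y))
    s1 = max(float(r @ e ** 2 / r.sum()), 1e-6)
    s2 = max(float((1.0 - r) @ e ** 2 / max((1.0 - r).sum(), 1e-12)), 1e-6)
    pi = float(r.mean())

# Compare against the clean targets: the robust fit is far less corrupted.
rmse_plain = np.sqrt(np.mean((H @ w_plain - y_clean) ** 2))
rmse_robust = np.sqrt(np.mean((H @ w - y_clean) ** 2))
```

The design point this illustrates is the one the abstract argues: under a heavy-tailed output likelihood, points with large residuals receive low inlier responsibility and thus low weight in the output-weight solve, so a handful of gross outliers no longer drags the whole fit.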


Metrics
  • Article views: 6619
  • PDF downloads: 299
  • Cited by: 0
Publication history
  • Received: 2017-08-22
  • Revised: 2017-10-24
  • Published: 2018-02-05
