Journal of System Simulation ›› 2022, Vol. 34 ›› Issue (7): 1593-1604. doi: 10.16182/j.issn1004731x.joss.21-0182

• Simulation Modeling Theory and Method •

Research on a Prediction Model Based on Multi-scale LSTM

Junjie Qiu, Hong Zheng, Yunhui Cheng

  1. School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China
  • Received: 2021-03-08  Revised: 2021-06-24  Online: 2022-07-30  Published: 2022-07-20
  • Corresponding author: Hong Zheng  E-mail: 15995025072@163.com; zhenghong@ecust.edu.cn
  • About the author: Junjie Qiu (1996-), male, master's student; research interests include recommender systems. E-mail: 15995025072@163.com
  • Funding:
    Shanghai Special Fund for Informatization Development (Big Data Development) (201901043)

Research on a Prediction Model Based on Multi-scale LSTM

Junjie Qiu, Hong Zheng, Yunhui Cheng

  1. School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China
  • Received:2021-03-08 Revised:2021-06-24 Online:2022-07-30 Published:2022-07-20
  • Contact: Hong Zheng E-mail:15995025072@163.com;zhenghong@ecust.edu.cn

Abstract:

Aircraft engine remaining useful life (RUL) prediction is a core problem in prognostics and health management (PHM). To address the high dimensionality, strong lag, and high complexity of engine data, a multi-scale attention bidirectional long short-term memory neural network model based on self-training weights is proposed. Multi-scale features are extracted by bidirectional long short-term memory (BiLSTM) networks at different scales, and a fusion algorithm based on self-training weights is proposed in which an attention mechanism screens the features of different scales to improve prediction accuracy. The models are compared experimentally on the NASA C-MAPSS dataset, and the results show that the proposed prediction model improves both the accuracy and root mean square error metrics.

Key words: prognostics and health management, remaining useful life, bidirectional long short-term memory network, self-training weights, attention mechanism, fusion algorithm

Abstract:

Aircraft engine remaining useful life (RUL) prediction is a core problem in equipment prognostics and health management (PHM). To address the high dimensionality, strong lag, and high complexity of engine data, a multi-scale attention-based bidirectional long short-term memory neural network model with self-training weights is proposed. Multi-scale features are extracted by bidirectional long short-term memory (BiLSTM) networks operating at different scales. A fusion algorithm based on self-training weights is proposed, in which an attention mechanism screens the features of different scales to improve prediction accuracy. The models are compared on the NASA C-MAPSS dataset, and the results show that the proposed prediction model improves both the accuracy and the root mean square error metrics.

Key words: PHM, RUL, BiLSTM, self-training weights, attention mechanism, fusion algorithm
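
To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' released implementation: one BiLSTM branch per temporal scale extracts a feature vector, an attention scorer combined with learnable ("self-training") fusion weights weighs the scales, and a linear head regresses the RUL. The input width of 24 channels (21 C-MAPSS sensors plus 3 operating settings), the scale factors, and all layer sizes are illustrative assumptions.

# Hypothetical sketch of a multi-scale BiLSTM with attention-weighted fusion
# for RUL regression, assuming PyTorch and input of shape (batch, time, sensors).
import torch
import torch.nn as nn

class MultiScaleBiLSTM(nn.Module):
    def __init__(self, n_sensors=24, hidden=64, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        # One BiLSTM per temporal scale; coarser branches see down-sampled sequences.
        self.branches = nn.ModuleList([
            nn.LSTM(n_sensors, hidden, batch_first=True, bidirectional=True)
            for _ in scales
        ])
        # Attention scorer over the per-scale feature vectors.
        self.attn = nn.Linear(2 * hidden, 1)
        # "Self-training" fusion weights, one per scale (assumption: learned jointly).
        self.fusion_w = nn.Parameter(torch.ones(len(scales)))
        self.head = nn.Linear(2 * hidden, 1)   # scalar RUL output

    def forward(self, x):                       # x: (batch, time, sensors)
        feats = []
        for s, lstm in zip(self.scales, self.branches):
            out, _ = lstm(x[:, ::s, :])         # down-sample the time axis by the scale factor
            feats.append(out[:, -1, :])         # last hidden state as the scale feature
        feats = torch.stack(feats, dim=1)       # (batch, n_scales, 2*hidden)
        # Attention scores modulated by the learnable fusion weights.
        scores = self.attn(feats).squeeze(-1) * self.fusion_w
        alpha = torch.softmax(scores, dim=1).unsqueeze(-1)
        fused = (alpha * feats).sum(dim=1)      # weighted sum over scales
        return self.head(fused).squeeze(-1)     # predicted RUL

# Example: 32 windows of 30 cycles with 24 sensor channels.
model = MultiScaleBiLSTM()
rul = model(torch.randn(32, 30, 24))            # -> shape (32,)

Training would follow the usual regression setup, e.g. minimizing mean squared error between the predicted values and (piecewise-linear) RUL labels over sliding windows of the C-MAPSS series.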
