Capsule Network Based on Double-layer Attention Mechanism and Multi-scale Feature Extraction for Remaining Life Prediction

IF 2.6 | CAS Tier 4, Computer Science | Q3 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Neural Processing Letters | Pub Date: 2024-06-03 | DOI: 10.1007/s11063-024-11651-8
Zhiwu Shang, Zehua Feng, Wanxiang Li, Zhihua Wu, Hongchuan Cheng
{"title":"Capsule Network Based on Double-layer Attention Mechanism and Multi-scale Feature Extraction for Remaining Life Prediction","authors":"Zhiwu Shang, Zehua Feng, Wanxiang Li, Zhihua Wu, Hongchuan Cheng","doi":"10.1007/s11063-024-11651-8","DOIUrl":null,"url":null,"abstract":"<p>The era of big data provides a platform for high-precision RUL prediction, but the existing RUL prediction methods, which effectively extract key degradation information, remain a challenge. Existing methods ignore the influence of sensor and degradation moment variability, and instead assign weights to them equally, which affects the final prediction accuracy. In addition, convolutional networks lose key information due to downsampling operations and also suffer from the drawback of insufficient feature extraction capability. To address these issues, the two-layer attention mechanism and the Inception module are embedded in the capsule structure (mai-capsule model) for lifetime prediction. The first layer of the channel attention mechanism (CAM) evaluates the influence of various sensor information on the forecast; the second layer adds a time-step attention (TSAM) mechanism to the LSTM network to weigh the contribution of different moments of the engine's whole life cycle to the prediction, while weakening the influence of environmental noise on the prediction. The Inception module is introduced to perform multi-scale feature extraction on the weighted data to capture the degradation information to the maximum extent. Lastly, we are inspired to employ the capsule network to capture important position information of high and low-dimensional features, given its capacity to facilitate a more effective rendition of the overall features of the time-series data. The efficacy of the suggested model is assessed against other approaches and verified using the publicly accessible C-MPASS dataset. The end findings demonstrate the excellent prediction precision of the suggested approach.</p>","PeriodicalId":51144,"journal":{"name":"Neural Processing Letters","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Processing Letters","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s11063-024-11651-8","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

The era of big data provides a platform for high-precision remaining useful life (RUL) prediction, but effectively extracting key degradation information remains a challenge for existing RUL prediction methods. Existing methods ignore the variability across sensors and degradation moments and instead assign them equal weights, which degrades the final prediction accuracy. In addition, convolutional networks lose key information through downsampling operations and also suffer from insufficient feature-extraction capability. To address these issues, a two-layer attention mechanism and an Inception module are embedded in a capsule structure (the mai-capsule model) for lifetime prediction. The first layer, a channel attention mechanism (CAM), evaluates the influence of the individual sensor signals on the forecast; the second layer adds a time-step attention mechanism (TSAM) to the LSTM network to weigh the contribution of different moments of the engine's whole life cycle to the prediction, while weakening the influence of environmental noise. The Inception module performs multi-scale feature extraction on the weighted data to capture degradation information as fully as possible. Lastly, a capsule network is employed to capture the important positional information of high- and low-dimensional features, given its capacity to render the overall characteristics of time-series data more effectively. The efficacy of the proposed model is assessed against other approaches and verified on the publicly available C-MAPSS dataset. The final results demonstrate the excellent prediction precision of the proposed approach.
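To make the two-layer attention idea concrete, below is a minimal, hypothetical PyTorch sketch of sensor-channel attention followed by time-step attention over LSTM hidden states, roughly in the spirit of the CAM and TSAM layers described above. The class names, layer sizes, reduction ratio, and the simple regression head are illustrative assumptions, not the authors' published implementation, and the Inception and capsule stages of the mai-capsule model are omitted.

```python
# Hypothetical sketch of a double-layer attention pipeline for RUL prediction:
# channel attention over sensors, then time-step attention over LSTM outputs.
# Names and sizes are illustrative, not the paper's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style weighting of sensor channels."""

    def __init__(self, num_sensors: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(num_sensors, num_sensors // reduction),
            nn.ReLU(),
            nn.Linear(num_sensors // reduction, num_sensors),
            nn.Sigmoid(),
        )

    def forward(self, x):                      # x: (batch, time, sensors)
        squeeze = x.mean(dim=1)                # average over time -> (batch, sensors)
        weights = self.fc(squeeze)             # per-sensor importance in (0, 1)
        return x * weights.unsqueeze(1)        # re-weight every time step


class TimeStepAttention(nn.Module):
    """Attention over LSTM hidden states to weight degradation moments."""

    def __init__(self, num_sensors: int, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(num_sensors, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, time, sensors)
        h, _ = self.lstm(x)                    # (batch, time, hidden)
        alpha = F.softmax(self.score(h), dim=1)  # weight per time step
        return (alpha * h).sum(dim=1)          # context vector (batch, hidden)


class DoubleAttentionRUL(nn.Module):
    """Toy RUL regressor combining both attention layers."""

    def __init__(self, num_sensors: int = 14, hidden: int = 32):
        super().__init__()
        self.cam = ChannelAttention(num_sensors)
        self.tsam = TimeStepAttention(num_sensors, hidden)
        self.head = nn.Linear(hidden, 1)       # scalar RUL estimate

    def forward(self, x):
        return self.head(self.tsam(self.cam(x)))


if __name__ == "__main__":
    # 8 windows of 30 time steps over 14 sensors (C-MAPSS-like shape).
    dummy = torch.randn(8, 30, 14)
    print(DoubleAttentionRUL()(dummy).shape)   # torch.Size([8, 1])
```

In the full model described by the abstract, the attention-weighted sequence would instead feed an Inception-style multi-scale feature extractor and then a capsule network, rather than the simple linear head used in this sketch.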


Source Journal

Neural Processing Letters (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 4.90
Self-citation rate: 12.90%
Articles per year: 392
Review time: 2.8 months
Journal description: Neural Processing Letters is an international journal publishing research results and innovative ideas on all aspects of artificial neural networks. Coverage includes theoretical developments, biological models, new formal modes, learning, applications, software and hardware developments, and prospective research. The journal promotes fast exchange of information in the community of neural network researchers and users. The resurgence of interest in the field of artificial neural networks since the beginning of the 1980s has been coupled with tremendous research activity in specialized or multidisciplinary groups. Research, however, is not possible without good communication between people and the exchange of information, especially in a field covering such different areas; fast communication is also a key aspect, and this is the reason for Neural Processing Letters.
Latest articles in this journal

Label-Only Membership Inference Attack Based on Model Explanation
A Robot Ground Medium Classification Algorithm Based on Feature Fusion and Adaptive Spatio-Temporal Cascade Networks
A Deep Learning-Based Hybrid CNN-LSTM Model for Location-Aware Web Service Recommendation
A Clustering Pruning Method Based on Multidimensional Channel Information
A Neural Network-Based Poisson Solver for Fluid Simulation