A new lifelong learning method based on dual distillation for bearing diagnosis with incremental fault types

Advanced Engineering Informatics (IF 9.9, CAS Tier 1 in Engineering, JCR Q1 in Computer Science, Artificial Intelligence). Published: 2025-05-01 (Epub 2025-01-22). DOI: 10.1016/j.aei.2025.103136
Shijun Xie , Changqing Shen , Dong Wang , Juanjuan Shi , Weiguo Huang , Zhongkui Zhu
Advanced Engineering Informatics, Volume 65, Article 103136. URL: https://www.sciencedirect.com/science/article/pii/S1474034625000291
Citations: 0

Abstract

In the rapidly evolving industrial environment, bearings may develop new fault types, posing significant challenges to deep learning-based intelligent fault diagnosis models. These models often suffer from catastrophic forgetting when encountering unknown fault types, resulting in performance degradation. Lifelong learning strategies offer a solution by enabling models to retain old knowledge while acquiring new information. However, traditional replay-based lifelong learning methods typically involve risks of privacy leakage and escalating storage costs. To address these issues, this study proposes a novel lifelong learning method called lifelong learning based on dual distillation (LLDD), which integrates a dual-distillation mechanism comprising dataset distillation and feature distillation, and introduces an equiangular basis vector (EBV) classifier. The dataset distillation technique compresses the dataset of each task into a small number of synthetic data that capture the essential information of the task, serving as replay exemplars. This approach reduces reliance on original data and storage costs. Feature distillation ensures that the model’s representations do not deviate significantly from previous ones. The proposed method effectively prevents an increase in the number of model parameters during the lifelong learning process by incorporating the EBV classifier, thereby maintaining model complexity stability. The performance of LLDD is validated on two bearing diagnosis cases with incremental fault types. Results demonstrate that the proposed method surpasses other lifelong learning methods in performance and memory efficiency.
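To illustrate the fixed-classifier idea the abstract attributes to the EBV component, below is a minimal sketch of a classifier built on equiangular unit vectors. It uses a simplex equiangular tight frame construction; the function names, dimensions, and the simplex-ETF choice are assumptions for illustration only, not the paper's actual implementation. Because the class vectors are fixed rather than trained, adding fault types never increases the number of trainable classifier parameters, which is the property the abstract highlights.

```python
import numpy as np

def simplex_etf(num_classes: int, dim: int, seed: int = 0) -> np.ndarray:
    """Build fixed classifier vectors with equal pairwise angles (simplex ETF).

    Returns a (dim, num_classes) matrix whose columns are unit vectors with
    pairwise cosine similarity -1/(num_classes - 1). Requires dim >= num_classes - 1.
    """
    assert dim >= num_classes - 1
    rng = np.random.default_rng(seed)
    # Reduced QR gives orthonormal columns U of shape (dim, num_classes).
    u, _ = np.linalg.qr(rng.standard_normal((dim, num_classes)))
    # Center the identity to form a simplex, then rescale columns to unit norm.
    center = np.eye(num_classes) - np.ones((num_classes, num_classes)) / num_classes
    return np.sqrt(num_classes / (num_classes - 1)) * (u @ center)

def classify(features: np.ndarray, basis: np.ndarray) -> np.ndarray:
    """Assign each feature row to the basis column with the highest cosine similarity."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    return (f @ basis).argmax(axis=1)

basis = simplex_etf(num_classes=4, dim=8)
cos = basis.T @ basis  # diagonal 1, all off-diagonal entries -1/3
preds = classify(basis.T, basis)  # each basis vector maps to its own class
```

In a lifelong-learning setting, only the feature extractor would be trained to map inputs near their class's fixed vector, so the classifier itself stays constant in size as fault types accumulate.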
Source journal: Advanced Engineering Informatics (Engineering, Multidisciplinary)
CiteScore: 12.40
Self-citation rate: 18.20%
Articles per year: 292
Review time: 45 days
Journal description: Advanced Engineering Informatics is an international journal that solicits research papers with an emphasis on 'knowledge' and 'engineering applications'. The journal seeks original papers that report progress in applying methods of engineering informatics. These papers should have engineering relevance and help provide a scientific base for more reliable, spontaneous, and creative engineering decision-making. Additionally, papers should demonstrate the science of supporting knowledge-intensive engineering tasks and validate the generality, power, and scalability of new methods through rigorous evaluation, preferably both qualitatively and quantitatively. Abstracting and indexing for Advanced Engineering Informatics include Science Citation Index Expanded, Scopus and INSPEC.