An Intelligent Bearing Fault Diagnosis based on Modified Probabilistic Knowledge Distillation

Ziqian Shen, Wei Guo
{"title":"基于改进概率知识精馏的轴承故障智能诊断","authors":"Ziqian Shen, Wei Guo","doi":"10.1109/PHM-Nanjing52125.2021.9612949","DOIUrl":null,"url":null,"abstract":"Knowledge distillation (KD) is one of popular algorithms for compressing deep neural networks because it generates a compact but still powerful deep neural network for the cases of complicated situations and limited computation resources. In this study, an intelligent fault diagnosis method is developed based on the probabilistic knowledge distillation (PKD) and deep convolutional neural network (CNN) to determine the health states of bearings. First, the one-dimensional vibration signal is reshaped as a two-dimensional matrix to input the teacher or student network. Then, a deeper neural network and small network are trained as the teacher and student networks, respectively. The probability distribution (PD) is learned by minimizing the difference of the joint density probability estimation between the teacher and student networks, that is, the lightweight network learns to integrate the PD of the deeper neural network in the high-dimensional feature space and realizes the knowledge transfer from training samples to test samples. The results of experimental bearings indicate that the proposed diagnosis method has higher diagnosis accuracy than the other two popular knowledge distillation methods and its student network only has about one 700-th parameter of the teacher network. Therefore, the proposed method achieves a good balance between the classification accuracy and network compression, and demonstrates potential application to intelligent fault diagnosis of bearings under varying working conditions.","PeriodicalId":436428,"journal":{"name":"2021 Global Reliability and Prognostics and Health Management (PHM-Nanjing)","volume":"183 5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"An Intelligent Bearing Fault Diagnosis based on Modified Probabilistic Knowledge Distillation\",\"authors\":\"Ziqian Shen, Wei Guo\",\"doi\":\"10.1109/PHM-Nanjing52125.2021.9612949\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Knowledge distillation (KD) is one of popular algorithms for compressing deep neural networks because it generates a compact but still powerful deep neural network for the cases of complicated situations and limited computation resources. In this study, an intelligent fault diagnosis method is developed based on the probabilistic knowledge distillation (PKD) and deep convolutional neural network (CNN) to determine the health states of bearings. First, the one-dimensional vibration signal is reshaped as a two-dimensional matrix to input the teacher or student network. Then, a deeper neural network and small network are trained as the teacher and student networks, respectively. The probability distribution (PD) is learned by minimizing the difference of the joint density probability estimation between the teacher and student networks, that is, the lightweight network learns to integrate the PD of the deeper neural network in the high-dimensional feature space and realizes the knowledge transfer from training samples to test samples. The results of experimental bearings indicate that the proposed diagnosis method has higher diagnosis accuracy than the other two popular knowledge distillation methods and its student network only has about one 700-th parameter of the teacher network. 
Therefore, the proposed method achieves a good balance between the classification accuracy and network compression, and demonstrates potential application to intelligent fault diagnosis of bearings under varying working conditions.\",\"PeriodicalId\":436428,\"journal\":{\"name\":\"2021 Global Reliability and Prognostics and Health Management (PHM-Nanjing)\",\"volume\":\"183 5 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-10-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 Global Reliability and Prognostics and Health Management (PHM-Nanjing)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/PHM-Nanjing52125.2021.9612949\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 Global Reliability and Prognostics and Health Management (PHM-Nanjing)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PHM-Nanjing52125.2021.9612949","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3

Abstract

Knowledge distillation (KD) is a popular algorithm for compressing deep neural networks because it produces a compact yet still powerful network for complicated application scenarios with limited computational resources. In this study, an intelligent fault diagnosis method based on probabilistic knowledge distillation (PKD) and a deep convolutional neural network (CNN) is developed to determine the health states of bearings. First, the one-dimensional vibration signal is reshaped into a two-dimensional matrix and fed into the teacher or student network. Then, a deeper neural network and a small network are trained as the teacher and student networks, respectively. The probability distribution (PD) is learned by minimizing the difference between the joint probability density estimates of the teacher and student networks; that is, the lightweight network learns to integrate the PD of the deeper network in the high-dimensional feature space and realizes knowledge transfer from the training samples to the test samples. Experimental results on bearing data indicate that the proposed method achieves higher diagnosis accuracy than two other popular knowledge distillation methods, while its student network has only about 1/700 of the parameters of the teacher network. Therefore, the proposed method strikes a good balance between classification accuracy and network compression, and shows potential for intelligent fault diagnosis of bearings under varying working conditions.
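To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch. The 64x64 reshaping size, the channel lists of the teacher and student CNNs, the ten health-state classes, and the temperature-softened KL divergence used to match the two networks' probability distributions are all illustrative assumptions; the paper's actual joint probability density estimation in the high-dimensional feature space is not specified in the abstract and may differ.

```python
# Sketch of signal reshaping, teacher/student CNNs, and a KD-style loss.
# All sizes, depths, and the distribution-matching term are assumptions,
# not the paper's exact PKD formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def reshape_signal(x_1d: torch.Tensor, side: int = 64) -> torch.Tensor:
    """Reshape a batch of 1-D vibration segments (N, side*side)
    into 2-D single-channel inputs (N, 1, side, side) for the CNNs."""
    return x_1d.view(-1, 1, side, side)

def make_cnn(channels):
    """Stack Conv-BN-ReLU-MaxPool blocks; a long channel list gives the
    deep teacher, a short one gives the lightweight student."""
    layers, in_ch = [], 1
    for out_ch in channels:
        layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                   nn.BatchNorm2d(out_ch), nn.ReLU(), nn.MaxPool2d(2)]
        in_ch = out_ch
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(),
               nn.Linear(in_ch, 10)]  # 10 bearing health states (assumed)
    return nn.Sequential(*layers)

teacher = make_cnn([32, 64, 128, 256, 256])  # deep "teacher" network
student = make_cnn([8, 16])                  # compact "student" network

def pkd_style_loss(student_logits, teacher_logits, labels,
                   alpha: float = 0.7, T: float = 4.0):
    """Hard-label cross-entropy plus a divergence between the teacher's
    and student's softened probability distributions. This temperature-
    softened KL term is a standard KD stand-in for the paper's
    joint-density matching, whose exact form is not given here."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)
    return (1.0 - alpha) * ce + alpha * kd

# Example forward pass with a batch of 4096-point vibration segments.
x = torch.randn(8, 64 * 64)
images = reshape_signal(x)          # (8, 1, 64, 64)
with torch.no_grad():
    t_logits = teacher(images)      # teacher is assumed already trained
s_logits = student(images)
labels = torch.randint(0, 10, (8,))
loss = pkd_style_loss(s_logits, t_logits, labels)
```

The channel lists above are chosen only to illustrate the size gap between the two networks; the roughly 700-fold parameter reduction reported in the paper depends on the authors' actual architectures.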