{"title":"Sample Augmentation Using Enhanced Auxiliary Classifier Generative Adversarial Network by Transformer for Railway Freight Train Wheelset Bearing Fault Diagnosis.","authors":"Jing Zhao, Junfeng Li, Zonghao Yuan, Tianming Mu, Zengqiang Ma, Suyan Liu","doi":"10.3390/e26121113","DOIUrl":null,"url":null,"abstract":"<p><p>Diagnosing faults in wheelset bearings is critical for train safety. The main challenge is that only a limited amount of fault sample data can be obtained during high-speed train operations. This scarcity of samples impacts the training and accuracy of deep learning models for wheelset bearing fault diagnosis. Studies show that the Auxiliary Classifier Generative Adversarial Network (ACGAN) demonstrates promising performance in addressing this issue. However, existing ACGAN models have drawbacks such as complexity, high computational expenses, mode collapse, and vanishing gradients. Aiming to address these issues, this paper presents the Transformer and Auxiliary Classifier Generative Adversarial Network (TACGAN), which increases the diversity, complexity and entropy of generated samples, and maximizes the entropy of the generated samples. The transformer network replaces traditional convolutional neural networks (CNNs), avoiding iterative and convolutional structures, thereby reducing computational expenses. Moreover, an independent classifier is integrated to prevent the coupling problem, where the discriminator is simultaneously identified and classified in the ACGAN. Finally, the Wasserstein distance is employed in the loss function to mitigate mode collapse and vanishing gradients. Experimental results using the train wheelset bearing datasets demonstrate the accuracy and effectiveness of the TACGAN.</p>","PeriodicalId":11694,"journal":{"name":"Entropy","volume":"26 12","pages":""},"PeriodicalIF":2.1000,"publicationDate":"2024-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11675503/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Entropy","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.3390/e26121113","RegionNum":3,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PHYSICS, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
Diagnosing faults in wheelset bearings is critical for train safety. The main challenge is that only a limited amount of fault sample data can be obtained during high-speed train operation. This scarcity of samples limits the training and accuracy of deep learning models for wheelset bearing fault diagnosis. Studies show that the Auxiliary Classifier Generative Adversarial Network (ACGAN) performs promisingly in addressing this issue. However, existing ACGAN models suffer from drawbacks such as structural complexity, high computational cost, mode collapse, and vanishing gradients. To address these issues, this paper presents the Transformer and Auxiliary Classifier Generative Adversarial Network (TACGAN), which increases the diversity and complexity of the generated samples and maximizes their entropy. A transformer network replaces the traditional convolutional neural networks (CNNs), avoiding recurrent and convolutional structures and thereby reducing computational cost. Moreover, an independent classifier is integrated to avoid the coupling problem in the ACGAN, where a single discriminator must perform real/fake discrimination and classification simultaneously. Finally, the Wasserstein distance is employed in the loss function to mitigate mode collapse and vanishing gradients. Experimental results on train wheelset bearing datasets demonstrate the accuracy and effectiveness of the TACGAN.
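To make the decoupling and Wasserstein-loss ideas from the abstract concrete, the sketch below outlines a TACGAN-style setup in PyTorch: a transformer-based generator conditioned on the fault label, a Wasserstein critic that only scores real vs. fake, and an independent classifier that only predicts fault labels. This is a minimal illustration under stated assumptions: the signal length, layer sizes, module names, and the gradient-penalty term are all illustrative choices for a 1-D vibration signal, not the authors' implementation.

```python
# Minimal TACGAN-style sketch (illustrative assumptions, not the paper's code):
# transformer generator + Wasserstein critic + independent classifier.
import torch
import torch.nn as nn

SIG_LEN, N_CLASSES, LATENT, D_MODEL = 1024, 4, 100, 64  # assumed sizes

class TransformerGenerator(nn.Module):
    """Maps (noise, one-hot label) to a synthetic 1-D vibration segment."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(LATENT + N_CLASSES, (SIG_LEN // 16) * D_MODEL)
        enc_layer = nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=4,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.out = nn.Linear(D_MODEL, 16)  # expand each token to 16 samples

    def forward(self, z, y_onehot):
        x = self.embed(torch.cat([z, y_onehot], dim=1))
        x = x.view(z.size(0), SIG_LEN // 16, D_MODEL)
        x = self.encoder(x)
        return torch.tanh(self.out(x)).flatten(1)  # (B, SIG_LEN)

class Critic(nn.Module):
    """Wasserstein critic: outputs a real/fake score only (no class head)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(SIG_LEN, 256), nn.LeakyReLU(0.2),
                                 nn.Linear(256, 1))
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):
    """Independent classifier: predicts the fault label only."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(SIG_LEN, 256), nn.LeakyReLU(0.2),
                                 nn.Linear(256, N_CLASSES))
    def forward(self, x):
        return self.net(x)

def critic_wloss(critic, real, fake):
    # Wasserstein objective for the critic: maximize E[D(real)] - E[D(fake)],
    # i.e. minimize its negative.
    return -(critic(real).mean() - critic(fake).mean())

def gradient_penalty(critic, real, fake):
    # WGAN-GP term to keep the critic approximately 1-Lipschitz; a common
    # companion to the Wasserstein loss and an assumption here, since the
    # abstract does not state how the Lipschitz constraint is enforced.
    eps = torch.rand(real.size(0), 1, device=real.device)
    mixed = eps * real + (1 - eps) * fake.detach()
    mixed.requires_grad_(True)
    grad = torch.autograd.grad(critic(mixed).sum(), mixed, create_graph=True)[0]
    return ((grad.norm(2, dim=1) - 1) ** 2).mean()

if __name__ == "__main__":
    B = 8
    z = torch.randn(B, LATENT)
    y = torch.eye(N_CLASSES)[torch.randint(0, N_CLASSES, (B,))]
    real = torch.randn(B, SIG_LEN)  # stand-in for real vibration windows
    G, D, C = TransformerGenerator(), Critic(), Classifier()
    fake = G(z, y)
    cls_loss = nn.CrossEntropyLoss()(C(real), y.argmax(dim=1))  # classifier trained separately
    print(critic_wloss(D, real, fake).item(),
          gradient_penalty(D, real, fake).item(),
          cls_loss.item())
```

The point of the separate Classifier module is the decoupling the abstract describes: the critic is free to estimate the Wasserstein distance without trading that off against label prediction, while the classifier can be trained on real and generated samples independently of the adversarial game.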
Journal Introduction
Entropy (ISSN 1099-4300) is an international and interdisciplinary journal of entropy and information studies that publishes reviews, regular research papers, and short notes. Its aim is to encourage scientists to publish their theoretical and experimental work in as much detail as possible; there is no restriction on paper length. Where computations or experiments are reported, sufficient detail must be provided so that the results can be reproduced.