Title: A Comparative Study of Noise Augmentation and Deep Learning Methods on Raman Spectral Classification of Contamination in Hard Disk Drive
Authors: S. Gulyanon, Somrudee Deepaisam, Chayud Srisumarnk, Nattapol Chiewnawintawat, Angkoon Anzkoonsawaenasuk, Seksan Laitrakun, Pakorn Ooaorakasit, P. Rakpongsiri, Thawanpat Meechamnan, D. Sompongse
Venue: 2022 17th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)
DOI: 10.1109/iSAI-NLP56921.2022.9960277
Publication date: 2022-11-05
Citation count: 0
Abstract
Deep neural networks have become state-of-the-art for many tasks in the past decade, including Raman spectral classification. However, these networks rely heavily on large collections of labeled data to avoid overfitting. Although labeled data is scarce in many application domains, techniques such as data augmentation can help alleviate the problem. In this paper, we investigate one particular kind of data augmentation, noise augmentation, which simply adds noise to input samples, for the Raman spectral classification task. Raman spectra yield fingerprint-like information about all chemical components but are prone to noise when the material's particles are small. We study the effectiveness of three noise models for noise augmentation in building a robust classification model: noise from the background chemicals, extended multiplicative signal augmentation (EMSA), and statistical noise. In the experiments, we compared the performance of 11 popular deep learning models under the three noise augmentation techniques. The results suggest that RNN-based models perform relatively well as the augmented data size increases, compared to CNN-based models, and that robust noise augmentation methods require noise with random-variation characteristics. However, hyperparameter optimization is crucial for taking optimal advantage of noise augmentation.
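To make the idea concrete, the sketch below illustrates two of the augmentation families the abstract names: plain statistical (additive Gaussian) noise, and an EMSA-style variant that perturbs a spectrum with a random multiplicative scale plus a low-order polynomial baseline. This is a minimal illustration of the general techniques, not the authors' implementation; the function names, noise levels, and polynomial order are hypothetical choices for demonstration.

```python
import numpy as np

def gaussian_noise_augment(spectrum, n_copies=5, noise_std=0.01, seed=None):
    """Statistical noise augmentation: return n_copies noisy versions of a
    spectrum, each with i.i.d. additive Gaussian noise.

    noise_std is a hypothetical hyperparameter, not a value from the paper.
    """
    rng = np.random.default_rng(seed)
    spectrum = np.asarray(spectrum, dtype=float)
    noise = rng.normal(0.0, noise_std, size=(n_copies, spectrum.size))
    return spectrum[None, :] + noise

def emsa_style_augment(spectrum, n_copies=5, seed=None):
    """EMSA-style augmentation sketch: each copy gets a random multiplicative
    scale and a random quadratic baseline, mimicking the multiplicative and
    baseline effects that extended multiplicative signal correction models."""
    rng = np.random.default_rng(seed)
    spectrum = np.asarray(spectrum, dtype=float)
    # Normalized wavenumber axis for the polynomial baseline term.
    x = np.linspace(-1.0, 1.0, spectrum.size)
    out = np.empty((n_copies, spectrum.size))
    for i in range(n_copies):
        scale = rng.normal(1.0, 0.05)                              # multiplicative effect
        baseline = np.polyval(rng.normal(0.0, 0.02, size=3), x)    # quadratic baseline drift
        out[i] = scale * spectrum + baseline
    return out
```

In a training pipeline, either function would be applied to each labeled spectrum before (or during) training to multiply the effective dataset size, with the noise magnitudes tuned as hyperparameters, consistent with the abstract's note that hyperparameter optimization is crucial.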