{"title":"利用混合卷积神经网络对增材制造中的混合数据进行不确定性量化","authors":"Jianhua Yin, Zhen Hu, X. Du","doi":"10.1115/1.4065444","DOIUrl":null,"url":null,"abstract":"\n Surrogate models have become increasingly essential for replacing simulation models in additive manufacturing (AM) process analysis and design, particularly for assessing the impact of microstructural variations and process imperfections (aleatory uncertainty). However, these surrogate models can introduce predictive errors, introducing epistemic uncertainty. The challenge arises when dealing with image input data, which is inherently high-dimensional, making it challenging to apply existing Uncertainty Quantification (UQ) techniques effectively. To address this challenge, this study develops a new UQ methodology based on an existing concept of combining Convolutional Neural Network (CNN) and Gaussian Process Regression (GPR). This CNN-GP method converts both numerical and image inputs into a unified, larger-sized image dataset, enabling direct dimension reduction with CNN. Subsequently, GPR constructs the surrogate model, not only providing predictions but also quantifying the associated model uncertainty. This approach ensures that the surrogate model considers both input-related aleatory uncertainty and model-related epistemic uncertainty when it is used for prediction, enhancing confidence in image-based AM simulations and informed decision-making. 
Three examples validate the high accuracy and effectiveness of the proposed method.","PeriodicalId":504755,"journal":{"name":"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Uncertainty Quantification with Mixed Data by Hybrid Convolutional Neural Network for Additive Manufacturing\",\"authors\":\"Jianhua Yin, Zhen Hu, X. Du\",\"doi\":\"10.1115/1.4065444\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\n Surrogate models have become increasingly essential for replacing simulation models in additive manufacturing (AM) process analysis and design, particularly for assessing the impact of microstructural variations and process imperfections (aleatory uncertainty). However, these surrogate models can introduce predictive errors, introducing epistemic uncertainty. The challenge arises when dealing with image input data, which is inherently high-dimensional, making it challenging to apply existing Uncertainty Quantification (UQ) techniques effectively. To address this challenge, this study develops a new UQ methodology based on an existing concept of combining Convolutional Neural Network (CNN) and Gaussian Process Regression (GPR). This CNN-GP method converts both numerical and image inputs into a unified, larger-sized image dataset, enabling direct dimension reduction with CNN. Subsequently, GPR constructs the surrogate model, not only providing predictions but also quantifying the associated model uncertainty. This approach ensures that the surrogate model considers both input-related aleatory uncertainty and model-related epistemic uncertainty when it is used for prediction, enhancing confidence in image-based AM simulations and informed decision-making. 
Three examples validate the high accuracy and effectiveness of the proposed method.\",\"PeriodicalId\":504755,\"journal\":{\"name\":\"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-05-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1115/1.4065444\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1115/1.4065444","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Uncertainty Quantification with Mixed Data by Hybrid Convolutional Neural Network for Additive Manufacturing
Surrogate models have become increasingly essential for replacing simulation models in additive manufacturing (AM) process analysis and design, particularly for assessing the impact of microstructural variations and process imperfections (aleatory uncertainty). However, surrogate models can introduce prediction errors of their own, a source of epistemic uncertainty. A further challenge arises with image input data, whose inherently high dimensionality makes existing Uncertainty Quantification (UQ) techniques difficult to apply effectively. To address this challenge, this study develops a new UQ methodology based on the existing concept of combining a Convolutional Neural Network (CNN) with Gaussian Process Regression (GPR). The CNN-GP method converts both numerical and image inputs into a unified, larger image dataset, enabling direct dimension reduction with the CNN. GPR then constructs the surrogate model, providing not only predictions but also a quantification of the associated model uncertainty. This approach ensures that the surrogate model accounts for both input-related aleatory uncertainty and model-related epistemic uncertainty when used for prediction, enhancing confidence in image-based AM simulations and supporting informed decision-making. Three examples validate the accuracy and effectiveness of the proposed method.
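The pipeline the abstract describes — merging numerical and image inputs into one image stack, reducing its dimension, and fitting a GPR surrogate whose predictive variance quantifies model uncertainty — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the data, sizes, and response are synthetic placeholders, and a fixed average-pooling step stands in for the trained convolutional encoder.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Hypothetical training data: 16x16 microstructure images plus one scalar
# process parameter per sample (all values synthetic, for illustration only).
n, h, w = 40, 16, 16
images = rng.random((n, h, w))
scalars = rng.random(n)
y = images.mean(axis=(1, 2)) + 0.5 * scalars  # placeholder response

# Step 1: broadcast each scalar into a constant image channel so numerical
# and image inputs form one unified image stack, as the CNN-GP idea requires.
scalar_channel = np.broadcast_to(scalars[:, None, None], (n, h, w))
stack = np.stack([images, scalar_channel], axis=1)       # shape (n, 2, h, w)

# Step 2: crude stand-in for CNN dimension reduction -- 4x4 average pooling.
# (The actual method trains a convolutional encoder for this step.)
pooled = stack.reshape(n, 2, h // 4, 4, w // 4, 4).mean(axis=(3, 5))
features = pooled.reshape(n, -1)                         # (n, 32) reduced features

# Step 3: GPR surrogate on the reduced features; the predictive standard
# deviation quantifies the model-related epistemic uncertainty.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=1e-6)
gp.fit(features, y)
mean, std = gp.predict(features, return_std=True)
print(mean.shape, std.shape)  # -> (40,) (40,)
```

The predictive mean serves as the surrogate prediction, while `std` reports how uncertain the surrogate is at each input; sampling the input images and scalars from their distributions would additionally propagate the aleatory uncertainty.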