{"title":"用于一般结构拓扑优化的一次性训练机器学习方法","authors":"Sen-Zhen Zhan , Xinhong Shi , Xi-Qiao Feng , Zi-Long Zhao","doi":"10.1016/j.tws.2024.112595","DOIUrl":null,"url":null,"abstract":"<div><div>Machine learning (ML) methods have found some applications in structural topology optimization. In the existing methods, however, the ML models need to be retrained when the design domains and supporting conditions have been changed, posing a limitation to their wide applications. In this paper, we propose a one-time training ML (OTML) method for general topology optimization, where the self-attention convolutional long short-term memory (SaConvLSTM) model is introduced to update the design variables. An extension–division approach is used to enrich the training sets. By developing a splicing strategy, the training results of a small design space (i.e., a basic cell of either two- or three-dimensions) can be extended to tackling the optimization problem of a large design domain with arbitrary geometric shapes. Using the OTML method, the ML model needs to be trained for only one time, and the trained model can be used directly to solve various optimization problems with arbitrary shapes of design domains, loads, and boundary conditions. In the SaConvLSTM model, the material volume of the post-processed thresholded designs can be precisely controlled, though the control precision of the gray-scale designs might be slightly sacrificed. The effects of model parameters on the computational cost and the result quality are examined. Four examples are provided to demonstrate the high performance of this structural design method. For large-scale optimization problems, the present method can accelerate the structural form-finding process. 
This study holds a promise in the high-resolution structural form-finding and transdisciplinary computational morphogenesis.</div></div>","PeriodicalId":49435,"journal":{"name":"Thin-Walled Structures","volume":"205 ","pages":"Article 112595"},"PeriodicalIF":5.7000,"publicationDate":"2024-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A one-time training machine learning method for general structural topology optimization\",\"authors\":\"Sen-Zhen Zhan , Xinhong Shi , Xi-Qiao Feng , Zi-Long Zhao\",\"doi\":\"10.1016/j.tws.2024.112595\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Machine learning (ML) methods have found some applications in structural topology optimization. In the existing methods, however, the ML models need to be retrained when the design domains and supporting conditions have been changed, posing a limitation to their wide applications. In this paper, we propose a one-time training ML (OTML) method for general topology optimization, where the self-attention convolutional long short-term memory (SaConvLSTM) model is introduced to update the design variables. An extension–division approach is used to enrich the training sets. By developing a splicing strategy, the training results of a small design space (i.e., a basic cell of either two- or three-dimensions) can be extended to tackling the optimization problem of a large design domain with arbitrary geometric shapes. Using the OTML method, the ML model needs to be trained for only one time, and the trained model can be used directly to solve various optimization problems with arbitrary shapes of design domains, loads, and boundary conditions. In the SaConvLSTM model, the material volume of the post-processed thresholded designs can be precisely controlled, though the control precision of the gray-scale designs might be slightly sacrificed. 
The effects of model parameters on the computational cost and the result quality are examined. Four examples are provided to demonstrate the high performance of this structural design method. For large-scale optimization problems, the present method can accelerate the structural form-finding process. This study holds a promise in the high-resolution structural form-finding and transdisciplinary computational morphogenesis.</div></div>\",\"PeriodicalId\":49435,\"journal\":{\"name\":\"Thin-Walled Structures\",\"volume\":\"205 \",\"pages\":\"Article 112595\"},\"PeriodicalIF\":5.7000,\"publicationDate\":\"2024-10-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Thin-Walled Structures\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0263823124010358\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, CIVIL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Thin-Walled Structures","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0263823124010358","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
A one-time training machine learning method for general structural topology optimization
Machine learning (ML) methods have found some applications in structural topology optimization. In the existing methods, however, the ML models must be retrained whenever the design domain or supporting conditions change, limiting their wide application. In this paper, we propose a one-time training ML (OTML) method for general topology optimization, in which a self-attention convolutional long short-term memory (SaConvLSTM) model is introduced to update the design variables. An extension–division approach is used to enrich the training sets. By developing a splicing strategy, the training results of a small design space (i.e., a basic cell in either two or three dimensions) can be extended to tackle the optimization of a large design domain with an arbitrary geometric shape. With the OTML method, the ML model needs to be trained only once, and the trained model can be used directly to solve various optimization problems with arbitrary design-domain shapes, loads, and boundary conditions. In the SaConvLSTM model, the material volume of the post-processed, thresholded designs can be precisely controlled, though the control precision of the gray-scale designs might be slightly sacrificed. The effects of the model parameters on the computational cost and result quality are examined. Four examples demonstrate the high performance of this structural design method. For large-scale optimization problems, the present method can accelerate the structural form-finding process. This study holds promise for high-resolution structural form-finding and transdisciplinary computational morphogenesis.
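The abstract states that the material volume of the post-processed, thresholded designs can be precisely controlled. The paper's exact post-processing step is not given here, but a common way to achieve this kind of control is to bisect on the density cutoff until the binarized design matches the target volume fraction. The sketch below is illustrative only, not the authors' implementation:

```python
def volume_preserving_threshold(densities, target_vf, max_iter=60):
    """Bisect on the cutoff so that the binarized design matches the
    target volume fraction. `densities` is a flat list of gray-scale
    design variables in [0, 1]."""
    lo, hi = 0.0, 1.0
    n = len(densities)
    cutoff = 0.5
    for _ in range(max_iter):
        cutoff = 0.5 * (lo + hi)
        solid_fraction = sum(1 for d in densities if d > cutoff) / n
        if solid_fraction > target_vf:
            lo = cutoff   # too much material: raise the cutoff
        else:
            hi = cutoff   # too little material: lower it
    binary = [1.0 if d > cutoff else 0.0 for d in densities]
    return binary, cutoff

# Example: binarize a synthetic gray-scale field to 40% solid volume.
import random
random.seed(0)
field = [random.random() for _ in range(10_000)]
binary, cutoff = volume_preserving_threshold(field, 0.40)
```

Because the cutoff is tuned to the design itself rather than fixed at 0.5, the thresholded result hits the prescribed volume to within one element, which is consistent with the precise volume control the abstract claims for the post-processed designs.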
Journal introduction:
Thin-walled structures comprise an important and growing proportion of engineering construction, with areas of application becoming increasingly diverse, ranging from aircraft, bridges, ships and oil rigs to storage vessels, industrial buildings and warehouses.
Many factors, including cost and weight economy, new materials and processes, and the growth of powerful methods of analysis have contributed to this growth, and led to the need for a journal which concentrates specifically on structures in which problems arise due to the thinness of the walls. This field includes cold-formed sections, plate and shell structures, reinforced plastics structures and aluminium structures, and is of importance in many branches of engineering.
The primary criterion for consideration of papers in Thin-Walled Structures is that they must be concerned with thin-walled structures or the basic problems inherent in thin-walled structures. Provided this criterion is satisfied, no restriction is placed on the type of construction, material or field of application. Papers on theory, experiment, design, etc., are published, and it is expected that many papers will contain aspects of all three.