{"title":"多问题自适应分解与增量超参数调优","authors":"Jialin Liu, X. Yao","doi":"10.1109/SSCI44817.2019.9002966","DOIUrl":null,"url":null,"abstract":"The Capacitated Arc Routing Problem (CARP) is a NP-hard combinatorial optimisation problem with numerous real-world applications. Several divide-and-conquer approaches, controlled by one or more hyperparameters, have been proposed to tackle large-scale CARPs. The tuning of hyperparameters can be computationally expensive due to the lack of priori knowledge, the size of the configuration space, and the time required for solving a CARP instance. Motivated by this time consuming task, we propose a scalable approach based on self-adaptive hierarchical decomposition (SASAHiD) to scale up existing methods. We take a state-of-the-art decomposition method for large-scale CARPs called SAHiD as an example to carry out experiments on two sets of real-world CARP instances with hundreds to thousands of tasks. The results demonstrate that SASAHiD outperforms SAHiD significantly with fewer hyperparameters, thus the dimension of associated configuration space is reduced. Moreover, we propose an incremental hyperparameter tuning approach across multiple problem instances to learn the hyperparameters of SASAHiD on a set of instances with different sizes. SASAHiD with optimised hyperparameters achieves better or competitive results with the SASAHiD using default hyperparameters when solving problem instances that it has never seen in the training set.","PeriodicalId":6729,"journal":{"name":"2019 IEEE Symposium Series on Computational Intelligence (SSCI)","volume":"48 1","pages":"1590-1597"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Self-adaptive Decomposition and Incremental Hyperparameter Tuning Across Multiple Problems\",\"authors\":\"Jialin Liu, X. Yao\",\"doi\":\"10.1109/SSCI44817.2019.9002966\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Capacitated Arc Routing Problem (CARP) is a NP-hard combinatorial optimisation problem with numerous real-world applications. Several divide-and-conquer approaches, controlled by one or more hyperparameters, have been proposed to tackle large-scale CARPs. The tuning of hyperparameters can be computationally expensive due to the lack of priori knowledge, the size of the configuration space, and the time required for solving a CARP instance. Motivated by this time consuming task, we propose a scalable approach based on self-adaptive hierarchical decomposition (SASAHiD) to scale up existing methods. We take a state-of-the-art decomposition method for large-scale CARPs called SAHiD as an example to carry out experiments on two sets of real-world CARP instances with hundreds to thousands of tasks. The results demonstrate that SASAHiD outperforms SAHiD significantly with fewer hyperparameters, thus the dimension of associated configuration space is reduced. Moreover, we propose an incremental hyperparameter tuning approach across multiple problem instances to learn the hyperparameters of SASAHiD on a set of instances with different sizes. 
SASAHiD with optimised hyperparameters achieves better or competitive results with the SASAHiD using default hyperparameters when solving problem instances that it has never seen in the training set.\",\"PeriodicalId\":6729,\"journal\":{\"name\":\"2019 IEEE Symposium Series on Computational Intelligence (SSCI)\",\"volume\":\"48 1\",\"pages\":\"1590-1597\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE Symposium Series on Computational Intelligence (SSCI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SSCI44817.2019.9002966\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE Symposium Series on Computational Intelligence (SSCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SSCI44817.2019.9002966","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Self-adaptive Decomposition and Incremental Hyperparameter Tuning Across Multiple Problems
The Capacitated Arc Routing Problem (CARP) is an NP-hard combinatorial optimisation problem with numerous real-world applications. Several divide-and-conquer approaches, controlled by one or more hyperparameters, have been proposed to tackle large-scale CARPs. Tuning these hyperparameters can be computationally expensive due to the lack of a priori knowledge, the size of the configuration space, and the time required to solve a CARP instance. Motivated by this time-consuming task, we propose a scalable approach based on self-adaptive hierarchical decomposition (SASAHiD) to scale up existing methods. We take SAHiD, a state-of-the-art decomposition method for large-scale CARPs, as an example and carry out experiments on two sets of real-world CARP instances with hundreds to thousands of tasks. The results demonstrate that SASAHiD significantly outperforms SAHiD while using fewer hyperparameters, thus reducing the dimension of the associated configuration space. Moreover, we propose an incremental hyperparameter tuning approach across multiple problem instances to learn the hyperparameters of SASAHiD on a set of instances of different sizes. SASAHiD with optimised hyperparameters achieves results that are better than or competitive with those of SASAHiD using default hyperparameters when solving problem instances not seen in the training set.
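As a rough illustration of the incremental-tuning idea described above (and not the paper's actual algorithm), the sketch below tunes a configuration on a set of CARP instances ordered by size, carrying the incumbent configuration forward from smaller to larger instances. All names here (solve_sasahid, incremental_tuning, the alpha hyperparameter, the instance format) are hypothetical placeholders; the real SASAHiD solver would take their place.

```python
# Illustrative sketch only: an assumed outline of incremental hyperparameter
# tuning across problem instances of increasing size. solve_sasahid is a
# hypothetical stand-in for running SASAHiD on one CARP instance; it is NOT
# the authors' implementation.
import random


def solve_sasahid(instance, hyperparams):
    """Placeholder objective: returns a synthetic cost for the best route set
    found on `instance` under the given hyperparameters (lower is better)."""
    rng = random.Random(hash((instance["name"], tuple(sorted(hyperparams.items())))))
    return instance["size"] * (1.0 + rng.random() * hyperparams["alpha"])


def incremental_tuning(instances, search_space, n_trials=20, seed=0):
    """Tune hyperparameters incrementally: start on the smallest instance and
    carry the best configuration forward as the starting point for each larger
    instance (one plausible reading of tuning 'across multiple problem
    instances')."""
    rng = random.Random(seed)
    instances = sorted(instances, key=lambda inst: inst["size"])
    best = {name: rng.choice(values) for name, values in search_space.items()}
    for inst in instances:
        best_cost = solve_sasahid(inst, best)
        for _ in range(n_trials):
            candidate = dict(best)
            # Perturb one hyperparameter at a time around the incumbent.
            name = rng.choice(list(search_space))
            candidate[name] = rng.choice(search_space[name])
            cost = solve_sasahid(inst, candidate)
            if cost < best_cost:
                best, best_cost = candidate, cost
    return best


if __name__ == "__main__":
    space = {"alpha": [0.1, 0.3, 0.5, 0.7, 0.9]}   # hypothetical hyperparameter grid
    train = [{"name": "town-A", "size": 200},      # hundreds of tasks
             {"name": "town-B", "size": 3000}]     # thousands of tasks
    print(incremental_tuning(train, space))
```

A configuration learned this way could then be evaluated on held-out instances, mirroring the train/test split across instance sizes that the abstract describes.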