{"title":"A Framework for Validation of Network-based Simulation Models: an Application to Modeling Interventions of Pandemics","authors":"Sichao Wu, H. Mortveit, Sandeep Gupta","doi":"10.1145/3064911.3064922","DOIUrl":null,"url":null,"abstract":"Network-based computer simulation models are powerful tools for analyzing and guiding policy formation related to the actual systems being modeled. However, the inherent data and computationally intensive nature of this model class gives rise to fundamental challenges when it comes to executing typical experimental designs. In particular this applies to model validation. Manual management of the complex simulation work-flows along with the associated data will often require a broad combination of skills and expertise. Examples of skills include domain expertise, mathematical modeling, programming, high-performance computing, statistical designs, data management as well as the tracking all assets and instances involved. This is a complex and error-prone process for the best of practices, and even small slips may compromise model validation and reduce human productivity in significant ways. In this paper, we present a novel framework that addresses the challenges of model validation just mentioned. The components of our framework form an ecosystem consisting of (i) model unification through a standardized model configuration format, (ii) simulation data management, (iii) support for experimental designs, and (iv) methods for uncertainty quantification, and sensitivity analysis, all ultimately supporting the process of model validation. (Note that our view of validation is much more comprehensive than simply ensuring that the computational model can reproduce instance of historical data.) This is an extensible design where domain experts from e.g. experimental design can contribute to the collection of available algorithms and methods. Additionally, our solution directly supports reproducible computational experiments and analysis, which in turn facilitates independent model verification and validation. Finally, to showcase our design concept, we provide a sensitivity analysis for examining the consequences of different intervention strategies for an influenza pandemic.","PeriodicalId":341026,"journal":{"name":"Proceedings of the 2017 ACM SIGSIM Conference on Principles of Advanced Discrete Simulation","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-05-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2017 ACM SIGSIM Conference on Principles of Advanced Discrete Simulation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3064911.3064922","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Network-based computer simulation models are powerful tools for analyzing and guiding policy formation related to the actual systems being modeled. However, the inherently data- and computationally intensive nature of this model class gives rise to fundamental challenges when it comes to executing typical experimental designs. In particular, this applies to model validation. Manual management of the complex simulation workflows, along with the associated data, often requires a broad combination of skills and expertise. Examples include domain expertise, mathematical modeling, programming, high-performance computing, statistical design, data management, as well as the tracking of all assets and instances involved. This is a complex and error-prone process even with the best of practices, and even small slips may compromise model validation and reduce human productivity in significant ways. In this paper, we present a novel framework that addresses these challenges of model validation. The components of our framework form an ecosystem consisting of (i) model unification through a standardized model configuration format, (ii) simulation data management, (iii) support for experimental designs, and (iv) methods for uncertainty quantification and sensitivity analysis, all ultimately supporting the process of model validation. (Note that our view of validation is much more comprehensive than simply ensuring that the computational model can reproduce instances of historical data.) The design is extensible: domain experts from, e.g., experimental design can contribute to the collection of available algorithms and methods. Additionally, our solution directly supports reproducible computational experiments and analysis, which in turn facilitates independent model verification and validation. Finally, to showcase our design concept, we provide a sensitivity analysis examining the consequences of different intervention strategies for an influenza pandemic.
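
To make the kind of experimental design and sensitivity analysis described in items (iii) and (iv) concrete, the sketch below shows a minimal, hypothetical workflow in Python: intervention parameters of a toy SIR-style model are swept with a full-factorial design, and crude main-effect sensitivity scores are computed. The parameter names, the toy model, and the variance-based scores are illustrative assumptions only; they are not the paper's framework, configuration format, or simulation engine.

```python
# Hypothetical sketch of a parameter sweep plus a simple sensitivity summary.
# All functional forms and parameter names below are illustrative assumptions.
import itertools
import statistics

def simulate_attack_rate(vaccination_rate, school_closure_day, compliance,
                         r0=1.6, days=180, population=100_000):
    """Toy deterministic SIR-style run standing in for a network-based simulation."""
    s, i, r = population - 10, 10, 0
    beta = r0 / 4.0      # crude transmission scaling (illustrative)
    gamma = 1.0 / 4.0    # recovery rate (illustrative)
    for day in range(days):
        # Interventions reduce effective transmission (assumed functional forms).
        reduction = vaccination_rate * 0.6
        if day >= school_closure_day:
            reduction += 0.3 * compliance
        eff_beta = beta * max(0.0, 1.0 - reduction)
        new_inf = eff_beta * s * i / population
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return r / population  # final attack rate

# Full-factorial design over three intervention parameters.
levels = {
    "vaccination_rate": [0.0, 0.25, 0.5],
    "school_closure_day": [10, 30, 60],
    "compliance": [0.4, 0.7, 1.0],
}
runs = []
for combo in itertools.product(*levels.values()):
    params = dict(zip(levels.keys(), combo))
    runs.append((params, simulate_attack_rate(**params)))

# Crude main-effect sensitivity: variance of per-level mean outcomes for each parameter.
for name, values in levels.items():
    level_means = [
        statistics.mean(out for p, out in runs if p[name] == v) for v in values
    ]
    print(f"{name:>20}: main-effect variance = {statistics.pvariance(level_means):.5f}")
```

In the framework described in the paper, the corresponding steps (design generation, run tracking, and analysis) would be driven by the standardized model configuration format and the managed simulation data rather than inlined in a single script as above.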