Title: Simulation optimization in Discrete Event Logistics Systems: The challenge of operational control
Authors: Timothy Sprock, L. McGinnis
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822174
Published in: 2016 Winter Simulation Conference (WSC)
Simulation optimization tools have the potential to provide an unprecedented level of support for the design and execution of operational control in Discrete Event Logistics Systems (DELS). While much of the simulation optimization literature has focused on developing and exploiting integration and syntactical interoperability between simulation and optimization tools, maximizing the effectiveness of these tools to support the design and execution of control behavior requires an even greater degree of interoperability than the current state of the art. In this paper, we propose a modeling methodology for operational control decision-making that can improve the interoperability between these two analysis methods and their associated tools in the context of DELS control. This methodology establishes a standard definition of operational control for both simulation and optimization methods and defines a mapping between decision variables (optimization) and execution mechanisms (simulation / base system). The goal is a standard for creating conforming simulation and optimization tools that are capable of meeting the functional needs of operational control decision-making in DELS.

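The mapping described in the abstract above — decision variables on the optimization side, execution mechanisms on the simulation / base-system side — can be illustrated with a minimal sketch. All names here (the dispatching rule, the state dictionary, the `decide`/`execute` split) are hypothetical illustrations, not the paper's actual methodology:

```python
class ShortestQueueRule:
    """Optimization side: the decision variable is which machine receives
    the next job (a hypothetical dispatching rule, not from the paper)."""
    def decide(self, state):
        queues = state["queues"]
        return {"assign_to": min(queues, key=queues.get)}

def execute(decision, state):
    """Simulation / base-system side: the execution mechanism that applies
    the decision. Because both sides share one control contract, a
    conforming simulator and a conforming optimizer stay interchangeable."""
    state["queues"][decision["assign_to"]] += 1
    return state

state = {"queues": {"m1": 3, "m2": 1}}
state = execute(ShortestQueueRule().decide(state), state)
print(state["queues"])  # m2 had the shortest queue, so it receives the job
```

The point of the shared contract is that either side can be swapped (a different rule, a different simulator) without touching the other.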
Title: Towards fine grained human behaviour simulation models
Authors: Meghendra Singh, Mayuri Duggirala, Harshal G. Hayatnagarkar, Sachin Patel, Vivek Balaraman
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822375
Agent-based simulation modelers have found it difficult to build grounded, fine-grained simulation models of human behavior. By grounded we mean that the model elements must rest on valid observations of the real world; by fine-grained we mean the ability to factor in multiple dimensions of behavior such as personality, affect, and stress. In this paper, we present a set of guidelines for building such models using fragments of behavior mined from past literature in the social sciences as well as from behavioral studies conducted in the field. The behavior fragments serve as building blocks for composing grounded, fine-grained behavior models. The models can be used in simulations to study the dynamics of any set of behavioral dimensions in a situation of interest. These guidelines are the result of our experience creating a fine-grained simulation model of a support services organization.

Title: Implementing discrete event simulation to improve Optometry Clinic operations
Authors: Michael D. Seminelli, James W. Wilson, Brandon M. McConnell
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822258
As the tempo of military operations slows, Army Medical Facilities are faced with the need to improve the efficiency of their clinics to provide timely service to the growing population of Soldiers spending more time at home station. Discrete event simulation was used to examine six scheduling and staffing policies for the Womack Army Medical Center's Optometry Clinic, with the goal of increasing the clinic's daily patient throughput while accounting for patient waiting times. The best policy increased clinic throughput by eight patients a day, generating an additional $314,000 in Relative Value Units (RVUs) annually, while increasing patient wait times by only 26%. At a minimum, increasing the walk-in provider's scheduled patient load by two enables the provider to optimally treat both scheduled and walk-in patients, with a $94,000 annual RVU increase. Implementing these results will improve clinic performance and revenue and increase Soldiers' access to care.

Title: Dark matter and Super Symmetry: Exploring and explaining the universe with simulations at the LHC
Authors: O. Gutsche
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822075
The Large Hadron Collider (LHC) at CERN in Geneva, Switzerland, is one of the largest machines on this planet. It is built to smash protons into each other at unprecedented energies to reveal the fundamental constituents of our universe. The four detectors at the LHC record multi-petabyte datasets every year. The scientific analysis of this data requires equally large simulation datasets of the collisions, based on the theory of particle physics, the Standard Model. The goal is to verify the validity of the Standard Model or of theories that extend it, such as Super Symmetry and explanations of Dark Matter. I will give an overview of the nature of the simulations needed to discover new particles, like the Higgs boson in 2012, and review the different areas where simulations are indispensable: from the actual recording of the collisions, to the extraction of scientific results, to the conceptual design of improvements to the LHC.

Title: Modeling and analysis of intermodal supply paths to enhance sourcing decisions
Authors: A. Greenwood, T. Hill, Chase Saunders, R. Holt
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822277
Since most material input to a manufacturing process is transported via multiple modes of transportation, often over long distances, the sourcing decision has a major impact on enterprise performance in terms of cost, timeliness, quality, and more. Critical elements of those decisions include specifying from where to acquire the material, in what quantity, and by which modes it should be shipped.

Title: ASTRO-DF: Adaptive sampling trust-region optimization algorithms, heuristics, and numerical experience
Authors: S. Shashaani, S. R. Hunter, R. Pasupathy
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822121
ASTRO-DF is a class of adaptive sampling algorithms for solving simulation optimization problems in which only estimates of the objective function are available by executing a Monte Carlo simulation. ASTRO-DF algorithms are iterative trust-region algorithms, where a local model is repeatedly constructed and optimized as iterates evolve through the search space. The ASTRO-DF class of algorithms is derivative-free in the sense that it does not rely on direct observations of the function derivatives. A salient feature of ASTRO-DF is the incorporation of adaptive sampling and replication to keep the model error and the trust-region radius in lock-step, to ensure efficiency. ASTRO-DF has been demonstrated to generate iterates that globally converge to a first-order critical point with probability one. In this paper, we describe and list ASTRO-DF, and discuss key heuristics that ensure good finite-time performance. We report our numerical experience with ASTRO-DF on test problems in low to moderate dimensions.

Title: A simulation based cut generation approach to improve DEO efficiency: The Buffer Allocation case
Authors: Mengyi Zhang, A. Matta, A. Alfieri, Giulia Pedrielli
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822412
The stochastic Buffer Allocation Problem (BAP), which deals with the optimal allocation of buffer spaces among the stages of a system, is well known in several fields and has been shown to be NP-hard. Simulation optimization is one way to solve the problem approximately; in particular, we adopt the Discrete Event Optimization (DEO) approach, in which the simulation optimization of the BAP is modeled as a single Mixed Integer Programming model. Despite the advantages of having a single model for both simulation and optimization, solving it can be extremely demanding. In this work, we propose a Benders decomposition approach to efficiently solve large DEO models of the BAP, in which cuts are generated by simulation. Numerical experiments show that computation time can be significantly reduced with this approach.

Title: Ontology-based semantic model of supply chains for modeling and simulation in distributed environment
Authors: J. Sarli, H. Leone, María de los Milagros Gutiérrez
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822175
Distributed simulation has become a suitable tool for simulating complex systems with heterogeneous models, such as supply chains, mainly due to the modularity of its components. The High Level Architecture (HLA) is widely used as a standard for building distributed simulation systems. However, the composability of simulation models in a federation scheme is the main problem to be overcome, and most solutions propose conceptual modeling for developing the federation. This work presents an ontology network that conceptualizes the different domains involved in designing a simulation model of a supply chain in a distributed environment. An ontology network makes it possible to develop the conceptual model in a modular and incremental way. The domains considered are: the data model domain, the federation domain, the supply chain domain, and the enterprise model domain.

Title: Simulation metamodeling in the presence of model inadequacy
Authors: Xiaowei Zhang, Lu Zou
Pub Date: 2016-12-11. DOI: 10.5555/3042094.3042178
A simulation model is often used as a proxy for the real system of interest in a decision-making process. However, no simulation model is totally representative of reality, so the impact of model inadequacy on predictions of system performance should be carefully assessed. We propose a new metamodeling approach that simultaneously characterizes both the simulation model and its inadequacy. Our approach utilizes both simulation outputs and real data to predict system performance, and accounts for four types of uncertainty arising, respectively, from the unknown performance measure of the simulation model, simulation errors, unknown model inadequacy, and observation errors of the real system. Numerical results show that the new approach generally provides more accurate predictions.

Title: Quantifying the impact of uncertainty in human actions on the energy performance of educational buildings
Authors: Elie Azar, Ahmed Al Amoodi
Pub Date: 2016-12-11. DOI: 10.1109/WSC.2016.7822221
Actions taken by building occupants and facility managers can have significant impacts on building energy performance. Despite the growing interest in understanding the human drivers of energy consumption, the literature on the topic remains limited and mostly focuses on individual occupancy actions (e.g., changing thermostat set point temperatures). Consequently, the impact of uncertainty in human actions on overall building performance remains unclear. This paper proposes a novel method to quantify the impact of potential uncertainty in various operation actions on building performance, using a combination of Monte Carlo and Fractional Factorial analyses. The framework is illustrated in a case study of educational buildings, where deviations from base-case energy intensity levels exceed 50 kWh/m2/year in some cases. The main contributor to this variation is the thermostat temperature set point, followed by occupants' equipment and lighting consumption patterns during unoccupied periods.
