Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822248
Gabriela Martinez, T. Huschka, M. Sir, K. Pasupathy
This paper presents a scheduling policy that aims to reduce patient wait time for surgical treatment by coordinating clinical and surgical appointments. This study is of interest because a lack of coordination between these resources can lead to inefficient utilization of available capacity and, most importantly, to delays in patient access to surgical treatment. A simulation model is used to analyze the impact of the policy on patient access and surgical throughput.
Title: A coordinated scheduling policy to improve patient access to surgical services | Venue: 2016 Winter Simulation Conference (WSC)
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822292
Can Sun, H. Ehm, T. Rose
In the volatile semiconductor market, leading semiconductor manufacturers aim to keep their competitive advantage by providing better customization. In light of this situation, various technologies have been proposed, but complexity may also increase. This paper attempts to select the best strategy from a complexity perspective. We borrow from the theory of change management and view each new technology as a change to the as-is one. A generic framework for deciding on the best approach via complexity measurement is proposed. It is applied to a case study with three technologies (shared reticle, compound lot, and a combination of both), and for each we analyze its change impact and added complexity. This paper delivers both a guideline on how to build a complexity index to supplement cost-benefit analysis and its practical application to the decision-making process for handling small-volume production.
Title: Evaluation of small volume production solutions in semiconductor manufacturing: Analysis from a complexity perspective | Venue: 2016 Winter Simulation Conference (WSC)
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822182
A. Hill
Norovirus is a highly contagious gastrointestinal illness that causes the rapid onset of vomiting, diarrhea, and fever. The virus relies on fecal-oral transmission, making children particularly susceptible because of their increased incidence of hand-to-mouth contact. Side effects of the virus' symptoms, e.g., severe dehydration, can be problematic for children. This paper examines transmission of the virus among elementary school classrooms, evaluating policies to reduce the number of children who become infected. The model focuses on the daily activities that expose students to the virus, including classroom activities and lunch/recess. Two policies that limit student-to-student interaction, derived from guidelines published by the Centers for Disease Control and Prevention, were explored. The results demonstrated that implementing either policy helps reduce the number of students who become ill, and that the sooner a policy is implemented, the shorter the duration of the outbreak.
Title: Norovirus outbreaks: Using agent-based modeling to evaluate school policies | Venue: 2016 Winter Simulation Conference (WSC)
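The contact-driven classroom dynamics described in this abstract can be sketched as a minimal agent-based model, with a contact-limiting policy modeled simply as fewer random daily contacts. All parameter values below (class size, transmission probability, contact counts, infectious period) are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(11)
S, I, R = 0, 1, 2  # susceptible, infectious, recovered

def simulate_outbreak(contacts_per_day, n_students=25, p_transmit=0.05,
                      infectious_days=2, n_days=30):
    """One classroom outbreak; returns the total number ever infected."""
    state = np.full(n_students, S)
    days_left = np.zeros(n_students, dtype=int)
    state[0], days_left[0] = I, infectious_days      # one index case
    for _ in range(n_days):
        infectious = np.flatnonzero(state == I)
        for i in infectious:
            # Random contacts during class, lunch, and recess.
            for j in rng.integers(0, n_students, size=contacts_per_day):
                if state[j] == S and rng.random() < p_transmit:
                    state[j], days_left[j] = I, infectious_days
        days_left[infectious] -= 1
        state[infectious[days_left[infectious] == 0]] = R  # recover
    return int(np.sum(state != S))

# A stricter interaction-limiting policy means fewer daily contacts.
mean_baseline = np.mean([simulate_outbreak(8) for _ in range(200)])
mean_policy = np.mean([simulate_outbreak(2) for _ in range(200)])
```

Averaged over replications, the reduced-contact policy produces a clearly smaller outbreak, mirroring the qualitative finding reported in the abstract.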
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822104
Lucy E. Morgan, A. Titman, D. Worthington, B. Nelson
Input uncertainty (IU) arises when simulation models are driven by input distributions estimated from finite amounts of real-world data. Methods have been presented for quantifying IU when stationary input distributions are used. In this paper we extend this work and provide two methods for quantifying IU in simulation models driven by piecewise-constant non-stationary Poisson arrival processes. Numerical evaluations and illustrations of the methods are provided and indicate that the methods perform well.
Title: Input uncertainty quantification for simulation models with piecewise-constant non-stationary Poisson arrival processes | Venue: 2016 Winter Simulation Conference (WSC)
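A piecewise-constant non-stationary Poisson process is straightforward to simulate, and the basic IU picture can be illustrated by estimating the rates from a finite number of observed days and resampling. The rates, breakpoints, and parametric-bootstrap scheme below are illustrative assumptions, not the estimators developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_nhpp(rates, breaks, rng):
    """Arrival times from a Poisson process whose rate is rates[i]
    on the interval [breaks[i], breaks[i+1])."""
    times = []
    for lam, a, b in zip(rates, breaks[:-1], breaks[1:]):
        n = rng.poisson(lam * (b - a))            # count on this interval
        times.extend(rng.uniform(a, b, size=n))   # uniform given the count
    return np.sort(np.asarray(times))

true_rates = np.array([5.0, 12.0, 7.0])   # arrivals/hour (illustrative)
breaks = np.array([0.0, 3.0, 5.0, 8.0])   # an 8-hour day in three pieces
widths = np.diff(breaks)

one_day = sample_nhpp(true_rates, breaks, rng)

# "Real-world data": interval counts from m observed days -> MLE rates.
m = 20
day_counts = rng.poisson(true_rates * widths, size=(m, 3))
rate_hat = day_counts.mean(axis=0) / widths

# Parametric bootstrap of the rate estimates; driving the simulation with
# each resampled rate vector propagates input uncertainty to the output.
B = 500
boot_counts = rng.poisson(rate_hat * widths, size=(B, m, 3))
boot_rates = boot_counts.mean(axis=1) / widths
totals = np.array([len(sample_nhpp(r, breaks, rng)) for r in boot_rates])
```

The spread of `totals` now reflects both simulation noise and the uncertainty in the estimated rates, which is the quantity IU methods aim to decompose.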
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822201
Soroosh Gholami, H. Sarjoughian
This paper proposes a multi-resolution co-design modeling approach in which the hardware and software parts of systems are loosely represented and composable. The approach is demonstrated for Networks-on-Chip (NoCs), where the network software directs communication among switches, links, and interfaces. The complexity of such systems can be better tamed by modeling frameworks in which multi-resolution model abstractions along a system's hardware and software dimensions are separately specified. Such frameworks build on hierarchical, component-based modeling principles and methods. Hybrid model composition establishes relationships across models, while multi-resolution models can be better specified by separately accounting for multiple levels of hardware and software abstraction. For Network-on-Chip, the abstraction levels are interface, capacity, flit, and hardware, with resolutions defined in terms of object, temporal, process, and spatial aspects. The proposed modeling approach benefits from co-design and multi-resolution modeling in order to better manage the rich dynamics of the hardware and software parts of systems and their network-based interactions.
Title: Multi-resolution co-design modeling: A Network-on-Chip model | Venue: 2016 Winter Simulation Conference (WSC)
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822142
H. Lam, Huajie Qian
We study the empirical likelihood method for constructing statistically accurate confidence bounds for stochastic simulation under nonparametric input uncertainty. The approach is based on positing a pair of distributionally robust optimization problems, with a suitably averaged divergence constraint over the uncertain input distributions, calibrated with a χ2-quantile to provide asymptotic coverage guarantees. We present the theory giving rise to the constraint and the calibration. We also analyze the performance of our stochastic optimization algorithm. We numerically compare our approach with existing standard methods such as the bootstrap.
Title: The empirical likelihood approach to simulation input uncertainty | Venue: 2016 Winter Simulation Conference (WSC)
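The χ2-calibration idea can be illustrated in its simplest textbook form: the empirical likelihood confidence set for a mean retains a candidate value whenever minus twice the log EL ratio stays below the χ2 95% quantile with one degree of freedom. This one-dimensional sketch is only the classical special case (Owen's EL for a mean), not the paper's averaged-divergence, distributionally robust construction.

```python
import numpy as np

CHI2_95_DF1 = 3.841  # 95% quantile of the chi-square distribution, 1 df

def el_log_ratio(x, mu, tol=1e-10):
    """-2 log empirical likelihood ratio for H0: E[X] = mu."""
    d = x - mu
    if d.min() >= 0.0 or d.max() <= 0.0:
        return np.inf                    # mu outside the data's convex hull
    lo = -1.0 / d.max() + tol            # keep all 1 + lam*d_i positive
    hi = -1.0 / d.min() - tol
    for _ in range(200):                 # g(lam) is decreasing; bisect root
        lam = 0.5 * (lo + hi)
        if np.sum(d / (1.0 + lam * d)) > 0.0:
            lo = lam
        else:
            hi = lam
    return 2.0 * np.sum(np.log1p(lam * d))

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200)   # toy "real-world" input data

# The 95% EL confidence set for the mean keeps every candidate value whose
# statistic stays below the chi-square quantile.
grid = np.linspace(x.mean() - 1.0, x.mean() + 1.0, 201)
ci = [mu for mu in grid if el_log_ratio(x, mu) <= CHI2_95_DF1]
```

The statistic is zero at the sample mean and grows as the candidate moves away, so the retained grid points form an interval around it; the same quantile-thresholding logic underlies the calibrated coverage guarantee in the paper.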
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822343
Sung-Gil Ko, Woo-Seop Yun, Tae-Eog Lee
The objective of tactical-level chemical defense operations is to protect forces from chemical attack and restore combat power. To accomplish this objective, combat units, higher-level command, chemical protective weapons, and support units must perform their respective roles and cooperate with each other. The aim of this study is to evaluate the effects of factors affecting chemical operations. This study presents a chemical defense operations model using the DEVS formalism, along with virtual experiments. The virtual experiments evaluated protection effectiveness by varying chemical operation factors such as 1) detection range, 2) MOPP transition time, 3) NBC report make-up time, 4) report transmission time, and 5) chemical reconnaissance patrol time. The results of the experiments showed that chemical reconnaissance patrol time and communication time are as important as detection range in terms of strength preservation.
Title: Modeling and simulation-based analysis of effectiveness of tactical level chemical defense operations | Venue: 2016 Winter Simulation Conference (WSC)
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822102
H. Yao, L. Rojas-Nandayapa, T. Taimre
We consider the problem of estimating tail probabilities of random sums of infinite mixtures of phase-type (IMPH) distributions—a class of distributions corresponding to random variables which can be represented as a product of an arbitrary random variable with a classical phase-type distribution. Our motivation arises from applications in risk and queueing problems. Classical rare-event simulation algorithms cannot be implemented in this setting because these typically rely on the availability of the CDF or the MGF, but these are difficult to compute or not even available for the class of IMPH distributions. In this paper, we address these issues and propose alternative simulation methods for estimating tail probabilities of random sums of IMPH distributions; our algorithms combine importance sampling and conditional Monte Carlo methods. The empirical performance of each method suggested is explored via numerical experimentation.
Title: Estimating tail probabilities of random sums of infinite mixtures of phase-type distributions | Venue: 2016 Winter Simulation Conference (WSC)
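The conditional Monte Carlo ingredient can be sketched for one concrete instance: summands X_i = W_i * Y_i with a lognormal mixing variable W and an Erlang (a simple phase-type) component Y, whose survival function is available in closed form. Integrating out the last summand analytically gives a smooth unbiased estimator. The mixing distribution, Erlang component, and Poisson-sized sum are illustrative stand-ins, not the paper's algorithms, which also incorporate importance sampling.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(1)

def erlang_sf(t, k):
    """Closed-form survival function of Erlang(k, rate=1): P(Y > t)."""
    if t <= 0:
        return 1.0
    return exp(-t) * sum(t**j / factorial(j) for j in range(k))

def crude_mc_tail(b, k=2, n_rep=20000):
    """Crude Monte Carlo estimate of P(S > b), S = sum_{i<=N} W_i * Y_i."""
    hits = 0.0
    for _ in range(n_rep):
        n = 1 + rng.poisson(2.0)                    # random number of terms
        w = rng.lognormal(0.0, 1.0, size=n)         # mixing variables
        y = rng.gamma(shape=k, scale=1.0, size=n)   # Erlang(k, 1) parts
        hits += float(w @ y > b)
    return hits / n_rep

def cond_mc_tail(b, k=2, n_rep=20000):
    """Conditional MC: replace the indicator by the conditional tail
    probability of the last summand, given everything else."""
    est = 0.0
    for _ in range(n_rep):
        n = 1 + rng.poisson(2.0)
        w = rng.lognormal(0.0, 1.0, size=n)
        y = rng.gamma(shape=k, scale=1.0, size=n - 1)
        partial = w[:-1] @ y if n > 1 else 0.0
        est += erlang_sf((b - partial) / w[-1], k)
    return est / n_rep

p_crude = crude_mc_tail(10.0)
p_cond = cond_mc_tail(10.0)
```

Both estimators are unbiased for the same tail probability; the conditional version avoids the 0/1 indicator and typically has lower variance, which matters most deep in the tail.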
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822168
A. Skabar
Given data in a matrix X whose rows represent vectors and whose columns comprise a mix of discrete and continuous variables, the method presented in this paper can be used to generate random vectors whose elements display the same marginal distributions and correlations as the variables in X. The data are represented as a bipartite graph consisting of object nodes (representing vectors) and attribute-value nodes. A random walk can be used to estimate the distribution of a target variable conditioned on the remaining variables, allowing a random value to be drawn for that variable. This leads to the use of Gibbs sampling to generate entire vectors. Unlike conventional methods, the proposed method requires neither the joint distribution nor the correlations to be specified, learned, or modeled explicitly in any way. Application to the Australian Credit dataset demonstrates the feasibility of the approach in generating random vectors on challenging real-world datasets.
Title: Random vector generation from mixed-attribute datasets using random walk | Venue: 2016 Winter Simulation Conference (WSC)
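The Gibbs-sampling loop can be sketched on a toy all-discrete dataset. Here the conditional distribution of the target attribute is estimated by weighting each data row by how many non-target attribute values it shares with the current vector, which amounts to a one-step walk from shared attribute-value nodes to object nodes; this is a simplified stand-in for the paper's bipartite random walk and does not handle continuous columns.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy discrete dataset: rows are objects, columns are attributes.
X = np.array([
    [0, 1, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 0, 0],
    [0, 0, 0],
])

def conditional_probs(X, vec, target, smooth=1e-6):
    """P(attribute `target` = v | the other attributes of `vec`), with each
    row weighted by its overlap with `vec` on the non-target attributes."""
    others = [j for j in range(X.shape[1]) if j != target]
    weights = (X[:, others] == vec[others]).sum(axis=1) + smooth
    values = np.unique(X[:, target])
    probs = np.array([weights[X[:, target] == v].sum() for v in values])
    return values, probs / probs.sum()

def gibbs_sample(X, n_sweeps=50):
    """Generate one synthetic vector by sweeping over the attributes."""
    vec = X[rng.integers(len(X))].copy()    # start from a real row
    for _ in range(n_sweeps):
        for j in range(X.shape[1]):
            values, probs = conditional_probs(X, vec, j)
            vec[j] = rng.choice(values, p=probs)
    return vec

sample = gibbs_sample(X)
```

Repeating `gibbs_sample` yields synthetic vectors whose attribute combinations track the empirical co-occurrence structure of X, without ever writing down the joint distribution.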
Pub Date: 2016-12-11 | DOI: 10.1109/WSC.2016.7822112
E. Salimi, A. Abbas
The modeling of complex service systems entails capturing many sub-components of the system, and the dependencies that exist among them in the form of a joint probability distribution. Two common methods for constructing joint probability distributions from experts using partial information include maximum entropy methods and copula methods. In this paper we explore the performance of these methods in capturing the dependence between random variables using correlation coefficients and lower-order pairwise assessments. We focus on the case of discrete random variables, and compare the performance of these methods using a Monte Carlo simulation when the variables exhibit both independence and non-linear dependence structures. We show that the maximum entropy method with correlation coefficients and the Gaussian copula method perform similarly, while the maximum entropy method with pairwise assessments performs better particularly when the variables exhibit non-linear dependence.
Title: A simulation-based comparison of maximum entropy and copula methods for capturing non-linear probability dependence | Venue: 2016 Winter Simulation Conference (WSC)
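The Gaussian copula ingredient of the comparison can be sketched for discrete marginals: draw correlated standard normals, push them through the normal CDF to get dependent uniforms, then invert each discrete marginal CDF. The marginal probabilities and latent correlation below are illustrative, and note that the latent correlation is not exactly the correlation of the resulting discrete variables (matching a target correlation requires an adjustment step, which is part of what such comparisons examine).

```python
import numpy as np
from math import erf

rng = np.random.default_rng(3)
norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / np.sqrt(2.0))))

def gaussian_copula_discrete(cdfs, corr, n, rng):
    """Draw n vectors of discrete variables: dependence from a Gaussian
    copula with latent correlation matrix `corr`, marginals given by the
    cumulative-probability vectors in `cdfs` (one per variable)."""
    L = np.linalg.cholesky(np.asarray(corr))
    z = rng.standard_normal((n, len(cdfs))) @ L.T   # correlated normals
    u = norm_cdf(z)                                 # dependent uniforms
    out = np.empty((n, len(cdfs)), dtype=int)
    for j, cdf in enumerate(cdfs):
        out[:, j] = np.searchsorted(cdf, u[:, j])   # invert discrete CDF
    return out

cdfs = [np.array([0.2, 0.7, 1.0]),   # 3-category variable: P = .2/.5/.3
        np.array([0.5, 1.0])]        # binary variable:     P = .5/.5
corr = [[1.0, 0.6], [0.6, 1.0]]
sample = gaussian_copula_discrete(cdfs, corr, 20000, rng)
```

The generated columns reproduce the specified marginals while inheriting positive dependence from the latent normals; a maximum entropy construction would instead solve for the joint probability table directly under moment constraints.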