Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721553
Wayne Zandbergen
There is a wide range of opinion regarding the historical and theoretical causes of bank panics and financial crises. Current theory, and theory-based models, find little support in the historical record. This paper examines previous empirical findings based on detailed banking records and offers several new results based on detailed bank data from 1893 Helena, Montana. These findings suggest modeling bank panics as psycho-social events. The Bank Depositor Model (BDM) builds upon a model previously designed to examine emotions within a group (Bosse et al., 2009). BDM represents bank depositor behavior as resulting from a combination of heterogeneous agent (depositor) attributes, views expressed by those in an agent's social network, and exogenous events that may alter an agent's receptiveness to positive or negative views. Initial results conform with the described empirical facts.
Title: An empirically-grounded simulation of bank depositors
Published in: 2013 Winter Simulations Conference (WSC)
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721603
D. Morrice, D. Wang, J. Bard, Luci K. Leykum, S. Noorily, Poornachand Veerapaneni
The process of preparing patients for outpatient surgery is information intensive. However, medical records are often fragmented among different providers and systems. As a result, the preoperative assessment process is frequently prolonged by missing information, potentially leading to surgery delay or cancellation. In this study, we simulate an anesthesiology pre-operative assessment clinic to quantify the impact of patient information deficiency and to assist in the development of a patient-centered surgical home to mitigate this problem through better system-wide coordination.
Title: A simulation analysis of a patient-centered surgical home to improve outpatient surgical processes of care and outcomes
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721487
J. Kleijnen, E. Mehdad
A classic Kriging or Gaussian process (GP) metamodel estimates the variance of its predictor by plugging in the estimated GP (hyper)parameters; namely, the mean, variance, and covariances. The problem is that this predictor variance is biased. To solve this problem for deterministic simulations, we propose “conditional simulation” (CS), which gives predictions at an old point that in all bootstrap samples equal the observed value. CS accounts for the randomness of the estimated GP parameters. We use the CS predictor variance in the “expected improvement” criterion of “efficient global optimization” (EGO). To quantify the resulting small-sample performance, we experiment with multi-modal test functions. Our main conclusion is that EGO with classic Kriging seems quite robust; EGO with CS only tends to perform better in expensive simulation with small samples.
Title: Conditional simulation for efficient global optimization
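The expected improvement criterion the abstract builds on can be sketched for the classic plug-in Kriging case. This is the generic EI formula for minimization under assumed predictor means and standard deviations, not the authors' conditional-simulation variant; all numbers are illustrative.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """Classic EI for minimization: the expected amount by which a candidate
    point improves on the best simulated output f_min, given the Kriging
    predictor mean mu and standard deviation sigma at that point."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    ei = np.zeros_like(mu)
    pos = sigma > 0                      # EI is zero where the predictor is exact
    z = (f_min - mu[pos]) / sigma[pos]
    ei[pos] = (f_min - mu[pos]) * norm.cdf(z) + sigma[pos] * norm.pdf(z)
    return ei

# Illustrative candidates: EGO would simulate next wherever EI is largest.
ei_vals = expected_improvement(mu=[1.2, 0.9, 1.5], sigma=[0.3, 0.05, 0.8], f_min=1.0)
```

Note how the third candidate, despite the worst mean, can win on EI because its large predictor variance leaves room for improvement; that trade-off is exactly what the (biased) plug-in variance distorts.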
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721561
Quinn D. Conley
Abandonment is a key indicator of performance and a driver of service level in a call center. Calls that abandon affect the wait times of the remaining calls in the queue and the ability of call center resources to service the remaining calls. This interaction is further complicated when the call center has multiple arrival channels, handled by two groups of resources with a shared pool of resources between them. In this case, a valid call center model necessitates a highly accurate method for modeling abandonment. This paper documents a unique application of Kaplan-Meier survival analysis to model call center abandonment in a discrete event simulation model. The paper also demonstrates the benefits of using Kaplan-Meier versus another approach.
Title: Simulating abandonment using Kaplan-Meier survival analysis in a shared billing and claims call center
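The estimator itself is easy to sketch: abandoned calls contribute observed patience times, while answered calls right-censor the caller's patience at the moment of answer. A minimal hand-rolled Kaplan-Meier estimate over toy data (not the paper's call records):

```python
import numpy as np

def kaplan_meier(times, abandoned):
    """Kaplan-Meier estimate of the caller 'survival' (still-waiting) curve.

    times:     wait time of each call (seconds)
    abandoned: True if the call abandoned at that time; False means the call
               was answered, so its patience is right-censored there.
    Returns the abandonment times and the survival probabilities S(t)."""
    times = np.asarray(times, dtype=float)
    abandoned = np.asarray(abandoned, dtype=bool)
    order = np.argsort(times)
    times, abandoned = times[order], abandoned[order]
    at_risk, surv = len(times), 1.0
    out_t, out_s = [], []
    for t in np.unique(times):
        mask = times == t
        d = int(abandoned[mask].sum())        # abandonments at time t
        if d > 0:
            surv *= 1.0 - d / at_risk
            out_t.append(t)
            out_s.append(surv)
        at_risk -= int(mask.sum())            # events and censored leave the risk set
    return np.array(out_t), np.array(out_s)

# Toy data: waits in seconds; True = abandoned, False = answered (censored).
t, s = kaplan_meier([10, 20, 20, 35, 50, 60], [True, False, True, True, False, True])
```

In a discrete-event model, the fitted curve S(t) can be inverted to sample each arriving call's patience, which is the kind of use the paper describes.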
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721442
C. G. Corlu, B. Biller
This paper considers a stochastic system simulation with unknown input distribution parameters and assumes the availability of a limited amount of historical data for parameter estimation. We investigate how to account for parameter uncertainty - the uncertainty that is due to the estimation of the input distribution parameters from historical data of finite length - in a subset selection procedure that identifies the stochastic system designs whose sample means are within a user-specified distance of the best mean performance measure. We show that even when the number of simulation replications is large enough for the stochastic uncertainty to be negligible, the amount of parameter uncertainty in output data imposes a threshold on the user-specified distance for an effective use of the subset selection procedure for simulation. We demonstrate the significance of this effect of parameter uncertainty for a multi-item inventory system simulation in the presence of short demand histories.
Title: A subset selection procedure under input parameter uncertainty
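A plain-vanilla subset-selection screen, ignoring the paper's parameter-uncertainty adjustment, can be sketched as follows: keep every design whose sample mean is statistically within the indifference distance delta of each competitor. The Bonferroni-style allowance is one textbook choice, not the authors' procedure, and the test data are invented.

```python
import numpy as np
from scipy.stats import t as student_t

def subset_selection(outputs, delta, alpha=0.05):
    """Keep each design whose sample mean is plausibly within delta of the
    best (smaller is better). outputs is a list of replication arrays."""
    means = np.array([np.mean(y) for y in outputs])
    variances = np.array([np.var(y, ddof=1) for y in outputs])
    ns = np.array([len(y) for y in outputs])
    k = len(outputs)
    # Bonferroni-style critical value for the k-1 comparisons each design faces
    tcrit = student_t.ppf(1 - alpha / (k - 1), df=ns.min() - 1)
    keep = []
    for i in range(k):
        allowance = tcrit * np.sqrt(variances[i] / ns[i] + variances / ns)
        if all(means[i] <= means[j] + delta + allowance[j]
               for j in range(k) if j != i):
            keep.append(i)
    return keep

rng = np.random.default_rng(0)
designs = [rng.normal(m, 0.1, size=50) for m in (0.0, 0.1, 10.0)]
kept = subset_selection(designs, delta=0.5)
```

The paper's point is that when input parameters are themselves estimated from short histories, the extra variance in `outputs` effectively puts a floor under the delta for which such a screen remains informative.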
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721666
L. F. Perrone, T. Henderson, M. J. Watrous, Vinicius Daly Felizardo
An important design decision in the construction of a simulator is how to enable users to access the data generated in each run of a simulation experiment. As the simulator executes, the samples of performance metrics that are generated beg to be exposed either in their raw state or after having undergone mathematical processing. Also of concern is the particular format this data assumes when externalized to mass storage, since it determines the ease of processing by other applications or interpretation by the user. In this paper, we present a framework for the ns-3 network simulator for capturing data from inside an experiment, subjecting it to mathematical transformations, and ultimately marshaling it into various output formats. The application of this functionality is illustrated and analyzed via a study of common use cases. Although the implementation of our approach is specific to ns-3, this design presents lessons transferable to other platforms.
Title: The design of an output data collection framework for NS-3
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721495
Honggang Wang
Optimal development of oil and gas fields involves determining well locations in oil reservoirs and well control through the production time. Field development problems are mixed-integer optimization problems because the well locations are defined by integer-valued block indices in the discrete reservoir model, while the well control variables such as bottom hole pressures or injection rates are continuous. Reservoir simulation software is used to evaluate production performance given a well placement and control plan. In the presence of reservoir uncertainty, we sample and simulate multiple model realizations to estimate the expected field performance. We present a retrospective optimization using dynamic simplex interpolation (RODSI) algorithm for oil field development under uncertainty. The numerical results show that the RODSI algorithm efficiently finds a solution yielding a 20% increase (compared to a solution suggested from heuristics) in the expected net present value (NPV) over 30 years of reservoir production for the considered Brugge case.
Title: Mixed integer simulation optimization for petroleum field development under geological uncertainty
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721576
Hikaru Ichimura, S. Takakuwa
Recently, significant research interest has been focused on environmental management aimed at the sustainable development of enterprises and society while decreasing the impact of such development on the environment. Japanese companies have been developing a variety of approaches and strategies. Material flow cost accounting (MFCA) has been proposed as a generally applicable indicator of growth potential and corporate environmental impact. Many companies that have introduced MFCA have been able to recognize previously unnoticed losses. In addition, MFCA is useful as a tool to evaluate environmental impact and draft improved, more cost-efficient manufacturing plans. This paper demonstrates that companies that introduce MFCA can improve decision-making procedures and advantageously alter their manufacturing methods in a manner that differs from the inventory reduction idea based on the traditional Toyota production system.
Title: Decision making on manufacturing system from the perspective of material flow cost accounting
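MFCA's central move is to cost material losses at full value instead of absorbing them into product cost, allocating input costs between the positive product and the material loss in proportion to mass (the ISO 14051 convention). A toy allocation with invented figures, not data from the paper:

```python
def mfca_allocation(material_cost, system_cost, output_mass, loss_mass):
    """Split total input costs between the positive product and the material
    loss ('negative product') in proportion to mass, the core MFCA idea."""
    total_cost = material_cost + system_cost
    loss_share = loss_mass / (output_mass + loss_mass)
    return {
        "positive_product_cost": total_cost * (1.0 - loss_share),
        "material_loss_cost": total_cost * loss_share,
    }

# 1,000 kg of input, 100 kg of which ends up as scrap/loss.
costs = mfca_allocation(material_cost=80_000, system_cost=20_000,
                        output_mass=900, loss_mass=100)
```

Making the loss visible as its own cost line is what lets the simulation experiments in the paper compare manufacturing plans on something other than pure inventory reduction.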
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721581
A. Muñoz-Villamizar, J. Montoya-Torres, A. Juan, José Cáceres-Cruz
In most medium- and large-sized cities around the world, freight transportation operations can have a noticeable impact on urban traffic mobility as well as on city commercial activities. In order to reduce both traffic congestion and pollution levels, several initiatives have traditionally been implemented. One of the most common strategies concerns the allocation of urban distribution warehouses near the city center in order to consolidate freight delivery services. This paper considers the integrated problem of locating distribution centers in urban areas and the corresponding freight distribution (vehicle routing). The combined problem is solved by using a hybrid algorithm which employs Monte Carlo simulation to induce biased randomness into several stages of the optimization procedure. The approach is then validated using real-life data, comparing our results with those from other works already available in the literature.
Title: A simulation-based algorithm for the integrated location and routing problem in urban logistics
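The "biased randomness" ingredient is commonly implemented as a quasi-geometric choice over a greedy-sorted candidate list, so that good moves stay likely but the construction is no longer deterministic. A sketch under that assumption; the beta parameter and the savings list are invented for illustration and are not reported details of the paper:

```python
import math
import random

def biased_index(n, beta=0.3, rng=random):
    """Quasi-geometric index in [0, n): position k is drawn with probability
    roughly beta * (1 - beta)**k, then wrapped into range, so best-ranked
    candidates are favored without being chosen deterministically."""
    u = 1.0 - rng.random()                        # u in (0, 1], avoids log(0)
    k = int(math.log(u) / math.log(1.0 - beta))
    return k % n

# Biased-randomized constructive step: repeatedly take a near-greedy move
# from a savings list sorted best-first (illustrative edges, not real data).
savings = ["(A,B)", "(A,C)", "(B,D)", "(C,D)", "(B,C)"]
rng = random.Random(42)
ordering = []
candidates = savings[:]
while candidates:
    ordering.append(candidates.pop(biased_index(len(candidates), rng=rng)))
```

Running this constructive step many times inside a Monte Carlo loop, and keeping the best complete location-plus-routing solution found, is the general shape of such hybrid algorithms.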
Pub Date: 2013-12-08 | DOI: 10.1109/WSC.2013.6721728
Anna Rotondo, J. Geraghty, P. Young
When inspection economies are implemented in multi-product, multi-stage, parallel processing manufacturing systems, there exists a significant risk of losing control of the monitoring efficacy of the sampling strategy adopted. For a product-based sampling decision limited to a particular station in a production segment, the randomness of the departure process and the merging of different product flows at the machines of the different stations subvert the regularity of deterministic sampling. The risk of not regularly monitoring any machine in the segment can be measured in terms of the maximum number of consecutive unsampled items. In this study, the distribution of this measure at sampling station machines is developed for a production scenario characterized by one monitored product and an unmonitored flow, and is compared with the behavior of the same measure at non-sampling station machines. The prediction models illustrated prove to be practical tools for quality managers involved in sampling-strategy decisions.
Title: Quality risk analysis at sampling stations crossed by one monitored product and an unmonitored flow
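The "maximum number of consecutive unsampled items" measure is straightforward to study by Monte Carlo. The sketch below assumes a Bernoulli product mix at the machine and an every-k-th-monitored-item sampling rule; the mix probability and k are invented, not the paper's settings.

```python
import random

def max_unsampled_run(n_items, p_monitored, k, rng):
    """Longest run of consecutive unsampled items at one machine, when each
    arriving item belongs to the monitored product with probability
    p_monitored and every k-th monitored item is pulled for inspection."""
    run = longest = monitored = 0
    for _ in range(n_items):
        sampled = False
        if rng.random() < p_monitored:
            monitored += 1
            if monitored % k == 0:
                sampled = True
        if sampled:
            run = 0
        else:
            run += 1
            longest = max(longest, run)
    return longest

rng = random.Random(7)
runs = [max_unsampled_run(1000, p_monitored=0.4, k=5, rng=rng) for _ in range(500)]
```

Tabulating `runs` approximates the kind of distribution the paper develops analytically; tightening k, or reducing the unmonitored share of the flow, visibly shortens the worst-case unmonitored stretch.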