Tutorial: Illusion of capacity — Challenge of incorporating the complexity of FAB capacity (tool deployment & operating curve) into central planning for firms with substantial non-fab complexity
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465207
K. Fordyce, Jonathan Fournier, R. Milne, Harpal Singh
Since the early 1990s, organizations have focused on making smarter decisions in their integrated supply chain central planning, but the representation of capacity and cycle time has remained static and linear, in contrast to its complex nature. This includes central planning for firms with semiconductor fabrication facilities (FABS) as a component of a complex demand supply network (DSN) where much of the complexity is non-FAB. Developing more intelligent solutions for capacity in central planning within computational and process limitations is a critical challenge. For DSNs with FABS, the twin challenges are tool deployment and the operating curve. Many in the FAB community are aware of these complexities; options have been proposed, and some implemented, within "aggregate FAB planning," but rarely within central planning. This tutorial reviews the current state of central planning with respect to capacity and cycle time, outlines the challenges these complexities place on central planning structures, and indicates possible solutions.
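The operating curve named above is the nonlinear relationship between fab utilization and cycle time that static, linear capacity representations miss. A minimal sketch, assuming the commonly cited approximation X = 1 + α·u/(1−u) for the cycle-time multiplier; the specific functional form and the α value are assumptions for illustration, not taken from this tutorial:

```python
# Illustrative sketch of the fab "operating curve": cycle time grows
# nonlinearly with utilization, which a static linear capacity model misses.
# Assumes the common approximation X = 1 + alpha * u / (1 - u), where
# X = cycle time / raw process time and alpha reflects fab variability.
# (The functional form and alpha are assumptions, not taken from the paper.)

def cycle_time_multiplier(u: float, alpha: float = 1.0) -> float:
    """Cycle-time multiplier X at utilization u (0 <= u < 1)."""
    if not 0.0 <= u < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return 1.0 + alpha * u / (1.0 - u)

if __name__ == "__main__":
    for u in (0.5, 0.7, 0.8, 0.9, 0.95):
        print(f"u = {u:.2f}  ->  X = {cycle_time_multiplier(u):.1f}x raw process time")
```

Pushing utilization from 80% to 95% quadruples the multiplier in this sketch, which is the nonlinearity a fixed lead-time parameter in central planning cannot express.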
{"title":"Tutorial: Illusion of capacity — Challenge of incorporating the complexity of FAB capacity (tool deployment & operating curve) into central planning for firms with substantial non-fab complexity","authors":"K. Fordyce, Jonathan Fournier, R. Milne, Harpal Singh","doi":"10.1109/WSC.2012.6465207","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465207","url":null,"abstract":"Since the early 1990s, organizations have focused on making smarter decisions in their integrated supply chain central planning, but the representation of capacity and cycle time has remained static and linear in contrast to its complex nature. This includes central planning for firms with semiconductor fabrication facilities (FABS) as a component of a complex demand supply network(DSN) where much of the complexity is non-FAB. Developing more intelligent solutions for capacity in central planning within computational and process limitations is a critical challenge. For DSNs with FABS, twin challenges are tool deployment and the operating curve. Many in the FAB community are aware of these complexities; options proposed and some implemented within “aggregate FAB planning,” rarely within central planning. This tutorial reviews the current state of central planning with respect to capacity and cycle time, outlines the challenges these complexities place on central planning structures, and indicates possible solutions.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"84 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115138548","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
GPS-based framework towards more realistic and real-time construction equipment operation simulation
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465159
Nipesh Pradhananga, J. Teizer
This paper presents an automated GPS-based method for assessing construction equipment operations productivity. The literature revealed several shortcomings in the simulation of construction equipment, for example the availability of realistic data to support a simulation framework, and identified the need for integrating real-time field data into simulations. Commercially available GPS-based data logging technology was then evaluated. Analysis methods and rules for monitoring productivity were also discussed. A software interface was created that allowed users to analyze and visualize several parameters important to creating more realistic simulation models. The experimental results demonstrated a productivity assessment method that collects spatio-temporal data using GPS data logging technology, applies it to construction equipment operations, and identifies and tracks productivity- and safety-based information for job site layout decision making. This research aids construction project managers in decision making for planning work tasks, hazard identification, and worker training by providing realistic and real-time project equipment operation information.
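One simple analysis rule of the kind described can be sketched as follows: classify each logged GPS fix as active or idle by a speed threshold and report the equipment's active-time fraction. The record layout and the 0.5 m/s threshold are hypothetical assumptions, not the paper's rules:

```python
# Hypothetical sketch: derive a simple productivity measure from GPS logs by
# classifying each fix as "active" or "idle" using a speed threshold.
# Field names, units, and the 0.5 m/s threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GpsFix:
    t: float      # timestamp in seconds
    speed: float  # instantaneous speed in m/s

def utilization(fixes: list[GpsFix], idle_speed: float = 0.5) -> float:
    """Fraction of logged time the equipment moved faster than idle_speed."""
    active = total = 0.0
    for prev, cur in zip(fixes, fixes[1:]):
        dt = cur.t - prev.t
        total += dt
        if prev.speed > idle_speed:
            active += dt
    return active / total if total else 0.0

if __name__ == "__main__":
    log = [GpsFix(t, s) for t, s in [(0, 0.1), (10, 2.3), (20, 1.8), (30, 0.0), (40, 0.2)]]
    print(f"utilization = {utilization(log):.0%}")
```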
{"title":"GPS-based framework towards more realistic and real-time construction equipment operation simulation","authors":"Nipesh Pradhananga, J. Teizer","doi":"10.1109/WSC.2012.6465159","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465159","url":null,"abstract":"This paper presents an automated GPS-based method for assessing construction equipment operations productivity. The literature revealed several shortcomings in simulation of construction equipment, for example, the availability of realistic data that supports a simulation framework, and identified the need for integrating real-time field data into simulations. Commercially available GPS-based data logging technology was then evaluated. Analysis methods and rules for monitoring productivity were also discussed. A software interface was created that allowed to analyze and visualize several important parameters towards creating more realistic simulation models. The experimental results showed a productivity assessment method by collecting spatio-temporal data using GPS data logging technology, applied to construction equipment operations, and finally identified and tracked productivity and safety based information for job site layout decision making. This research aids construction project managers in decision making for planning work tasks, hazard identification, and worker training by providing realistic and real-time project equipment operation information.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115360484","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sensitivity analysis of an ICU simulation model
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465121
T. Bountourelis, David J. Eckman, K. L. Luangkesorn, A. Schaefer, Spencer G. Nabors, G. Clermont
The modeling and simulation of inpatient healthcare systems comprising multiple interconnected units of monitored care is a challenging task, given the nature of the clinical practices and procedures that regulate patient flow. Therefore, any study on the properties of patient flow should (i) explicitly consider the modeling of patient movement rules in the face of congestion, and (ii) examine the sensitivity of simulation output, expressed as patient delays and diversions, across different patient movement modeling approaches. In this work, we use a high-fidelity simulation model of a tertiary facility that can incorporate complex patient movement rules to investigate the challenges inherent in employing it for resource allocation tasks.
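To make the sensitivity question concrete, here is a toy sketch of two congestion rules a model might encode for the same transfer decision: block in the current unit, or divert off-service. Unit capacities and the policy labels are hypothetical, not taken from the paper's model:

```python
# Illustrative sketch of two patient-movement rules under congestion, of the
# kind whose effect on simulated delays and diversions the paper examines:
# "block" (wait in the current unit) versus "divert" (send off-service).
def route(target_census: int, target_capacity: int, policy: str) -> str:
    """Decide what happens to a patient requesting transfer to a unit."""
    if target_census < target_capacity:
        return "admit"
    return "wait_in_place" if policy == "block" else "divert_off_service"

if __name__ == "__main__":
    for policy in ("block", "divert"):
        print(policy, "->", route(target_census=10, target_capacity=10, policy=policy))
```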
{"title":"Sensitivity analysis of an ICU simulation model","authors":"T. Bountourelis, David J. Eckman, K. L. Luangkesorn, A. Schaefer, Spencer G. Nabors, G. Clermont","doi":"10.1109/WSC.2012.6465121","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465121","url":null,"abstract":"The modeling and simulation of inpatient healthcare systems comprising of multiple interconnected units of monitored care is a challenging task given the nature of clinical practices and procedures that regulate patient flow. Therefore, any related study on the properties of patient flow should (i) explicitly consider the modeling of patient movement rules in face of congestion, and (ii) examine the sensitivity of simulation output, expressed by patient delays and diversions, over different patient movement modeling approaches. In this work, we use a high fidelity simulation model of a tertiary facility that can incorporate complex patient movement rules to investigate the challenges inherent in its employment for resource allocation tasks.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115494924","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Simulation and optimization of robot driven production systems for peak-load reduction
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465002
S. Lorenz, Matthias Hesse, A. Fischer
One way to improve energy efficiency in manufacturing is the use of energy-sensitive methods in production planning. So far, the energy consumption behavior of production facilities has not been investigated in great detail; estimates are typically obtained from connected wattage values and concurrency factors. We present a new methodology to simulate and optimize complex robot-driven production systems with special emphasis on energy aspects. In particular, we show how to translate the process descriptions and energy consumption profiles into a discrete-event-based simulation model and illustrate this with the example of a car body shop facility. To minimize the peak load, we set up an optimization model based on periodic time-expanded networks. A solution to this model corresponds to a process sequence for the robots that prescribes relative starting times via additional wait intervals. This sequence is then reinserted into the simulation model to validate the improvement.
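The objective can be illustrated in a few lines: per-robot power profiles superpose, and shifting start times (the wait intervals) lowers the peak of the total load. The paper solves this over a periodic time-expanded network; the brute-force search below is only a sketch of the idea, with made-up profiles:

```python
# Minimal sketch of the peak-load objective: superpose per-robot power
# profiles (kW per time step) and search start-time offsets that minimize
# the peak of the total load. The paper optimizes this with a periodic
# time-expanded network model; this exhaustive search only illustrates it.
from itertools import product

def total_load(profiles, offsets, horizon):
    load = [0.0] * horizon
    for profile, off in zip(profiles, offsets):
        for i, p in enumerate(profile):
            load[(off + i) % horizon] += p  # periodic schedule wraps around
    return load

def best_offsets(profiles, horizon):
    return min(
        (tuple(offs) for offs in product(range(horizon), repeat=len(profiles))),
        key=lambda offs: max(total_load(profiles, offs, horizon)),
    )

if __name__ == "__main__":
    robots = [[5, 5, 1, 0], [4, 4, 0, 0]]  # assumed power draw per step
    offs = best_offsets(robots, horizon=4)
    print("offsets:", offs, "peak:", max(total_load(robots, offs, 4)))
```

Staggering the second robot by two steps drops the peak from 9 to 5 in this toy instance, which is exactly the effect the wait intervals buy.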
{"title":"Simulation and optimization of robot driven production systems for peak-load reduction","authors":"S. Lorenz, Matthias Hesse, A. Fischer","doi":"10.1109/WSC.2012.6465002","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465002","url":null,"abstract":"One way to improve the energy efficiency in manufacturing is the use of energy-sensitive methods in production planning. So far, the energy consumption behavior of production facilities has not been investigated in great detail. Estimates are typically obtained by connected wattage values and concurrency factors. We present a new methodology to simulate and optimize complex robot driven production systems with special emphasis on energy aspects. In particular, we show how to translate the process descriptions and energy consumption profiles into a discrete-event-based simulation model and illustrate this with an example of a car body shop facility. In order to minimize the peak-load we set up an optimization model that is based on periodic time-expanded networks. A solution of this model corresponds to a process sequence for the robots that prescribes relative starting times via additional wait intervals. This sequence is then reinserted into the simulation model to validate the improvement.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116802936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Operations analysis and appointment scheduling for an outpatient chemotherapy department
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6464990
M. Yokouchi, Setsuko Aoki, Haixia Sang, R. Zhao, S. Takakuwa
With increasing demand for outpatient medical services, healthcare providers have had to analyze methods for providing safe, high-quality care under constrained resources. A long waiting time severely affects not only the satisfaction but also the physical condition of patients receiving invasive treatment in ambulatory care facilities. The number of patients treated with outpatient chemotherapy has increased rapidly over the last decade in Japan. In this context, a discrete event simulation model for exploring appointment scheduling in the outpatient chemotherapy department of a general hospital was developed. An efficient schedule was identified that kept bed utilization at a tolerable level while restraining excess waiting time in a clinical setting. The results suggest that a scheduling method based on infusion time is suitable for the outpatient chemotherapy department.
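As a hedged illustration of an infusion-time-based rule (not the paper's validated schedule): assign the longest infusions first, each to the earliest-available bed, and report the implied day length and bed utilization. The durations, bed count, and longest-first heuristic are assumptions:

```python
# Hypothetical sketch of an infusion-time-based appointment rule: schedule
# longer infusions first, always onto the earliest-free bed, then report the
# clinic day length and bed utilization. The "longest first" heuristic is an
# illustrative assumption; the paper evaluates schedules by simulation.
import heapq

def schedule(infusion_minutes: list[int], beds: int):
    free_at = [0] * beds            # next free time of each bed
    heapq.heapify(free_at)
    makespan = 0
    for dur in sorted(infusion_minutes, reverse=True):
        start = heapq.heappop(free_at)      # earliest-available bed
        makespan = max(makespan, start + dur)
        heapq.heappush(free_at, start + dur)
    busy = sum(infusion_minutes)
    return makespan, busy / (beds * makespan)

if __name__ == "__main__":
    span, util = schedule([240, 180, 120, 90, 60, 60, 30], beds=3)
    print(f"clinic day ends at {span} min, bed utilization {util:.0%}")
```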
{"title":"Operations analysis and appointment scheduling for an outpatient chemotherapy department","authors":"M. Yokouchi, Setsuko Aoki, Haixia Sang, R. Zhao, S. Takakuwa","doi":"10.1109/WSC.2012.6464990","DOIUrl":"https://doi.org/10.1109/WSC.2012.6464990","url":null,"abstract":"With current increasing demands for outpatient medical services, healthcare providers have had to analyze methods for providing safety and quality care services under constrained resources. A long waiting-time has a severe impact not only on the patients' satisfaction but also on the physical conditions of the patients, who receive invasive treatment in ambulatory care facilities. Patients treated with outpatient chemotherapy have been rapidly increasing over the last decade in Japan. In this context, a discrete event simulation model for exploring appointment scheduling in an outpatient chemotherapy department of a general hospital was developed. An efficient schedule was identified that held bed utilization to a tolerance level by restraining the excess waiting-time in a clinical setting. It is suggested that a scheduling method based on the infusion time be available for the outpatient chemotherapy department.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117066344","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An integrated approach for the validation of emergence in component-based simulation models
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465059
Claudia Szabo, Y. M. Teo
Emergent properties are becoming increasingly important as systems grow in size and complexity. Despite recent research interest in understanding emergent behavior, practical approaches remain a key challenge. This paper proposes an integrated approach for the identification of emergence from two perspectives. A post-mortem emergence analysis requires a priori knowledge about emergence and can identify the causes of emergent behavior. In contrast, a live analysis, in which emergence is identified as it happens, does not require prior knowledge and relies on a more rigorous definition of individual model components in terms of what they achieve, rather than how. Our approach integrates reconstructability analysis into the validation of emergence within our proposed component-based model development life-cycle.
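In its simplest two-variable form, reconstructability analysis asks how well a joint distribution is reconstructed from its projections. A minimal sketch, assuming independence as the reconstruction and KL divergence as the discrepancy; the variables and data are hypothetical, and the paper's use of the technique is richer:

```python
# Illustrative sketch of the reconstructability-analysis idea: compare the
# observed joint distribution of two component outputs with the model
# reconstructed from their marginals (independence). A large divergence
# signals interaction the components alone do not explain, a crude proxy
# for emergent structure. Variable names and data are hypothetical.
from collections import Counter
from math import log2

def mutual_information(samples: list[tuple[str, str]]) -> float:
    """KL divergence (bits) between the joint and product-of-marginals."""
    n = len(samples)
    joint = Counter(samples)
    mx = Counter(x for x, _ in samples)
    my = Counter(y for _, y in samples)
    return sum(
        (c / n) * log2((c / n) / ((mx[x] / n) * (my[y] / n)))
        for (x, y), c in joint.items()
    )

if __name__ == "__main__":
    coupled = [("on", "hi"), ("on", "hi"), ("off", "lo"), ("off", "lo")] * 10
    print(f"divergence = {mutual_information(coupled):.2f} bits")  # ~1.0
```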
{"title":"An integrated approach for the validation of emergence in component-based simulation models","authors":"Claudia Szabo, Y. M. Teo","doi":"10.1109/WSC.2012.6465059","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465059","url":null,"abstract":"Emergent properties are becoming increasingly important as systems grow in size and complexity. Despite recent research interest in understanding emergent behavior, practical approaches remain a key challenge. This paper proposes an integrated approach for the identification of emergence with two perspectives. A post-mortem emergence analysis requires a-priori knowledge about emergence and can identify the causes of emergent behavior. In contrast, a live analysis, in which emergence is identified as it happens, does not require prior knowledge and relies on a more rigorous definition of individual model components in terms of what they achieve, rather than how. Our proposed approach integrates reconstructability analysis in the validation of emergence included in our proposed component-based model development life-cycle.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127105465","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Combining metamodel techniques and Bayesian selection procedures to derive computationally efficient simulation-based optimization algorithms
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465111
C. Osorio, H. Bidkhori
This paper presents a simulation-based optimization (SO) algorithm for nonlinear problems with general constraints and computationally expensive evaluations of the objective function. It focuses on metamodel techniques and proposes an SO technique that also uses metamodel information when testing the improvement of proposed points. We use a Bayesian framework in which the parameters of the prior distributions are estimated from probabilistic metamodel information. In order to derive an SO algorithm that achieves a good trade-off between detail, realism, and computational efficiency, the metamodel combines information from a high-resolution simulator with information from a lower-resolution yet computationally efficient analytical differentiable queueing network model. In this paper, we use the probabilistic information from the queueing model to estimate the parameters of the prior distributions. We evaluate the performance of this SO algorithm on an urban traffic management problem using a detailed microscopic traffic simulator of the Swiss city of Lausanne.
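The core Bayesian mechanism can be sketched with a conjugate normal-normal update in which the prior for a candidate point's objective is centered on the analytical queueing model's prediction, and noisy simulation replications update it. This is a simplified stand-in for the paper's estimation procedure; all numbers are illustrative:

```python
# Simplified sketch of the Bayesian idea: the prior on a candidate point's
# objective value is centered on the analytical queueing metamodel's
# prediction, and simulation replications update it (conjugate
# normal-normal). Illustrates the framework only; the paper's actual
# estimation procedure is more elaborate.
def posterior(prior_mean, prior_var, sim_obs, sim_var):
    """Normal-normal update: returns (posterior mean, posterior variance)."""
    n = len(sim_obs)
    xbar = sum(sim_obs) / n
    post_var = 1.0 / (1.0 / prior_var + n / sim_var)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / sim_var)
    return post_mean, post_var

if __name__ == "__main__":
    queueing_prediction = 42.0  # analytical metamodel output (assumed value)
    m, v = posterior(queueing_prediction, prior_var=25.0,
                     sim_obs=[47.1, 49.3, 46.0], sim_var=9.0)
    print(f"posterior mean {m:.1f}, variance {v:.2f}")
```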
{"title":"Combining metamodel techniques and Bayesian selection procedures to derive computationally efficient simulation-based optimization algorithms","authors":"C. Osorio, H. Bidkhori","doi":"10.1109/WSC.2012.6465111","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465111","url":null,"abstract":"This paper presents a simulation-based optimization (SO) algorithm for nonlinear problems with general constraints and computationally expensive evaluation of objective functions. It focuses on metamodel techniques. This paper proposes an SO technique that also uses metamodel information when testing the improvement of the proposed points. We use a Bayesian framework, where the parameters of the prior distributions are estimated based on probabilistic metamodel information. In order to derive an SO algorithm that achieves a good trade-off between detail, realism and computational efficiency, the metamodel combines information from a high-resolution simulator with information from a lower-resolution yet computationally efficient analytical differentiable network model. In this paper, we use the probabilistic information from the queueing model to estimate the parameters of the prior distributions. We evaluate the performance of this SO algorithm by addressing an urban traffic management problem using a detailed microscopic traffic simulator of the Swiss city of Lausanne.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127212070","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
User understanding of cognitive processes in simulation: A tool for exploring and modifying
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465046
David Scerri, Sarah L. Hickmott, L. Padgham
Agent-based simulations often model humans, and increasingly it is necessary to do this at an appropriate level of complexity. It has been suggested that the Belief Desire Intention (BDI) paradigm is suitable for modeling the cognitive processes of agents representing (some of) the humans in an agent-based simulation. This approach models agents as having goals, and as reacting to events, with high-level plans, or plan types, that are gradually refined as situations unfold. This is an intuitive approach for modeling human cognitive processes. However, it is important that users can understand, verify, and even contribute to the model being used. We describe a tool that can be used to explore, understand, and modify the BDI model of an agent's cognitive processes within a simulation. The tool is interactive and allows users to explore the options available (and not available) at a particular agent decision point.
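A minimal BDI-flavored sketch of the kind of decision point such a tool exposes: an event triggers a goal, and the agent selects the first plan whose context condition holds against its beliefs. The plan names, beliefs, and first-applicable selection rule are hypothetical simplifications:

```python
# Minimal BDI-flavored sketch: an event triggers goal handling by selecting
# the first plan whose context condition holds in the agent's beliefs,
# the kind of decision point the described tool lets users explore.
# Plans, beliefs, and event names are hypothetical.
class Plan:
    def __init__(self, name, context, steps):
        self.name, self.context, self.steps = name, context, steps

def handle_event(event, beliefs, plan_library):
    applicable = [p for p in plan_library.get(event, []) if p.context(beliefs)]
    if not applicable:
        return f"no applicable plan for {event}"
    chosen = applicable[0]  # real BDI systems use richer selection strategies
    return f"{chosen.name}: " + " -> ".join(chosen.steps)

if __name__ == "__main__":
    beliefs = {"has_car": False, "bus_running": True}
    library = {"goal_commute": [
        Plan("drive", lambda b: b["has_car"], ["walk to car", "drive to work"]),
        Plan("bus", lambda b: b["bus_running"], ["walk to stop", "ride bus"]),
    ]}
    print(handle_event("goal_commute", beliefs, library))
```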
{"title":"User understanding of cognitive processes in simulation: A tool for exploring and modifying","authors":"David Scerri, Sarah L. Hickmott, L. Padgham","doi":"10.1109/WSC.2012.6465046","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465046","url":null,"abstract":"Agent based simulations often model humans and increasingly it is necessary to do this at an appropriate level of complexity. It has been suggested that the Belief Desire Intention (BDI) paradigm is suitable for modeling the cognitive processes of agents representing (some of) the humans in an agent based modeling simulation. This approach models agents as having goals, and reacting to events, with high level plans, or plan types, that are gradually refined as situations unfold. This is an intuitive approach for modeling human cognitive processes. However, it is important that users can understand, verify and even contribute to the model being used. We describe a tool that can be used to explore, understand and modify, the BDI model of an agent's cognitive processes within a simulation. The tool is interactive and allows users to explore options available (and not available) at a particular agent decision point.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127549551","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Hardware-in-the-loop simulation for automated benchmarking of cloud infrastructures
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465036
Qi Liu, Márcio Silva, M. R. Hines, D. D. Silva
To address the challenge of automated performance benchmarking in virtualized cloud infrastructures, an extensible and adaptable framework called CloudBench has been developed to conduct scalable, controllable, and repeatable experiments in such environments. This paper presents the hardware-in-the-loop simulation technique used in CloudBench, which integrates an efficient discrete-event simulation with the cloud infrastructure under test in a closed feedback control loop. The technique supports the decomposition of complex resource usage patterns and provides a mechanism for statistically multiplexing application requests of varied characteristics to generate realistic and emergent behavior. It also exploits parallelism at multiple levels to improve simulation efficiency, while maintaining temporal and causal relationships with proper synchronization. Our experiments demonstrate that the proposed technique can synthesize complex resource usage behavior for effective cloud performance benchmarking.
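The statistical multiplexing mechanism can be sketched as the superposition of independent Poisson arrival streams, one per application profile, in a single discrete-event loop. The profile names and rates are assumptions, and this is not the CloudBench API:

```python
# Illustrative sketch of statistically multiplexing request streams: several
# independent Poisson arrival processes (one per application profile) are
# superposed in one discrete-event loop, yielding a mixed request trace.
# Rates and profile names are assumptions; this is not the CloudBench API.
import heapq
import random

def multiplex(profiles: dict[str, float], horizon: float, seed: int = 7):
    """profiles maps name -> mean arrival rate (requests per second)."""
    rng = random.Random(seed)
    events = [(rng.expovariate(rate), name) for name, rate in profiles.items()]
    heapq.heapify(events)
    trace = []
    while events:
        t, name = heapq.heappop(events)
        if t >= horizon:
            continue                 # stream finished; do not reschedule it
        trace.append((round(t, 3), name))
        heapq.heappush(events, (t + rng.expovariate(profiles[name]), name))
    return trace

if __name__ == "__main__":
    for t, name in multiplex({"web": 2.0, "batch": 0.5}, horizon=3.0):
        print(f"{t:7.3f}s  {name}")
```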
{"title":"Hardware-in-the-loop simulation for automated benchmarking of cloud infrastructures","authors":"Qi Liu, Márcio Silva, M. R. Hines, D. D. Silva","doi":"10.1109/WSC.2012.6465036","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465036","url":null,"abstract":"To address the challenge of automated performance benchmarking in virtualized cloud infrastructures, an extensible and adaptable framework called CloudBench has been developed to conduct scalable, controllable, and repeatable experiments in such environments. This paper presents the hardware-in-the-loop simulation technique used in CloudBench, which integrates an efficient discrete-event simulation with the cloud infrastructure under test in a closed feedback control loop. The technique supports the decomposition of complex resource usage patterns and provides a mechanism for statistically multiplexing application requests of varied characteristics to generate realistic and emergent behavior. It also exploits parallelism at multiple levels to improve simulation efficiency, while maintaining temporal and causal relationships with proper synchronization. Our experiments demonstrate that the proposed technique can synthesize complex resource usage behavior for effective cloud performance benchmarking.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124864618","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Optimal parallelization of a sequential approximate Bayesian computation algorithm
Pub Date: 2012-12-09 | DOI: 10.1109/WSC.2012.6465244
J. Marin, P. Pudlo, M. Sedki
Approximate Bayesian Computation (ABC) methods have proven very successful for Bayesian inference on the parameters of models whose likelihood is intractable to compute. These algorithms work by comparing the observed dataset to many simulated datasets, which can be generated in different ways. Typically, the rejection ABC scheme first simulates parameters through independent draws from the prior distribution and then, given these values, generates datasets through independent calls to the model. For such a method, the computation time needed to obtain a suitable approximation of the posterior distribution can be very long. There also exist sequential Monte Carlo methods that replace simulation from the prior with simulation from successive approximations to the posterior distribution. Here, we recall a sequential simulation algorithm and compare different parallelization strategies. We notably show that parallelizing the sequential ABC sampler is useless when using more than four threads per instance of the program, and that the standard rejection ABC sampler should be preferred when facing a large number of cores. Indeed, in that case, the cost of parallelizing the sequential ABC sampler prohibits its use.
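The rejection ABC scheme described above is easy to sketch for a toy model (a normal with unknown mean); the prior, summary statistic, and tolerance are illustrative assumptions:

```python
# Sketch of the rejection ABC scheme described above: draw a parameter from
# the prior, simulate a dataset from the model, and keep the draw when the
# simulated summary falls within epsilon of the observed one. The toy model
# (normal with unknown mean, uniform prior) is an illustrative assumption.
import random

def rejection_abc(observed_mean, n_obs, n_accept, eps, seed=0):
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_accept:
        theta = rng.uniform(-10, 10)                         # prior draw
        sim = [rng.gauss(theta, 1.0) for _ in range(n_obs)]  # model call
        if abs(sum(sim) / n_obs - observed_mean) < eps:      # summary check
            accepted.append(theta)
    return accepted

if __name__ == "__main__":
    post = rejection_abc(observed_mean=1.3, n_obs=50, n_accept=200, eps=0.2)
    print(f"posterior mean approx {sum(post) / len(post):.2f}")
```

Because every accepted draw comes from independent prior and model calls, this sampler parallelizes embarrassingly across cores, which is the regime in which the paper recommends it over the sequential sampler.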
{"title":"Optimal parallelization of a sequential approximate Bayesian computation algorithm","authors":"J. Marin, P. Pudlo, M. Sedki","doi":"10.1109/WSC.2012.6465244","DOIUrl":"https://doi.org/10.1109/WSC.2012.6465244","url":null,"abstract":"Approximate Bayesian Computation (ABC) methods have a lot of success to accomplish Bayesian inference on the parameters of models for which the calculation of the likelihood is intractable. These algorithms consists in comparing the observed dataset to many simulated datasets. These ones can be generated in different ways. Typically, the rejection ABC scheme consists first of simulating parameters using independent calls to the prior distribution and then, given these values, generating the datasets using independent calls to the model. For such a method, the computation time needed to get a suitable approximation of the posterior distribution can be very long. Also, there exist some sequential Monte Carlo methods replacing simulations from the prior by using successive approximations to the posterior distribution. Here, we recall a sequential simulation algorithm and we compare different parallelization strategies. We notably shown that the parallelization of the sequential ABC sampler is useless when using more than four threads per instance of the program and that the standard rejection ABC sampler has to be used when facing a large number of cores. Indeed, in such a case, the cost of the sequential ABC sampler's parallelization prohibits its use.","PeriodicalId":320728,"journal":{"name":"Proceedings Title: Proceedings of the 2012 Winter Simulation Conference (WSC)","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-12-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124907306","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}