Researchers at the National Institute of Standards and Technology have proposed the development of neutral libraries of simulation components. The availability of such libraries would simplify the generation of simulation models, enable component-based modeling, and speed Internet-based simulation services. The result would be a reduction in the complexity of simulation modeling and analysis. We consider a discrete-event simulation of the flow of jobs through a job shop. We describe the information requirements for the components in that simulation and provide formal models based on those requirements. We then derive a database structure from these formal models and discuss the population of that database with the data entries for a sample job shop. Finally, we examine the translators we developed to go from the neutral representation of the simulation components to the representation required by a commercial simulation package.
{"title":"Automatic generation of simulation models from neutral libraries: an example","authors":"Y. Son, Albert T. Jones, R. Wysk","doi":"10.1109/WSC.2000.899140","DOIUrl":"https://doi.org/10.1109/WSC.2000.899140","url":null,"abstract":"Researchers at the National Institute of Standards and Technology have proposed the development of neutral libraries of simulation components. The availability of such libraries would simplify the generation of simulation models, enable component-based modeling, and speed Internet-based simulation services. The result would be a reduction in the complexity of simulation modeling and analysis. We consider a discrete-event simulation of the flow of jobs through a job shop. We describe the information requirements for the components in that simulation and provide formal models based on those requirements. We then derive a database structure from these formal models and discuss the population of that database with the data entries for a sample job shop. Finally, we examine the translators we developed to go from the neutral representation of the simulation components to the representation required by a commercial simulation package.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"127 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115235079","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Yuh-Chyun Luo, Chun-Hung Chen, E. Yücesan, Insup Lee
Web technology is having a significant impact on computer simulation. Most of the effort in Web-based simulation is aimed at modeling, particularly at building simulation languages and at creating model libraries that can be assembled and executed over the Web. We focus instead on the efficiency of simulation experimentation for optimization. We introduce a framework that combines the statistical efficiency of simulation optimization techniques with the effectiveness of parallel execution algorithms. In particular, the Optimal Computing Budget Allocation (OCBA) algorithm is implemented in a Web-based environment for low-cost parallel and distributed simulation experimentation. A prototype implementation with some experimental results is presented.
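The abstract names OCBA but does not spell out the allocation rule. A commonly cited asymptotic form (after Chen et al.) gives non-best designs budget in proportion to (σ_i/δ_i)², where δ_i is the gap to the observed best mean. A minimal sketch, assuming minimization, distinct sample means, and positive variances; function and variable names are illustrative, not from the paper:

```python
import math

def ocba_allocation(means, stds, budget):
    """Asymptotic OCBA allocation (after Chen et al.), assuming smaller
    mean is better and all sample means are distinct. Non-best designs
    get budget in proportion to (sigma_i / delta_i)^2; the best design b
    gets sigma_b * sqrt(sum over i != b of r_i^2 / sigma_i^2)."""
    k = len(means)
    b = min(range(k), key=lambda i: means[i])      # observed best design
    r = [0.0] * k
    for i in range(k):
        if i != b:
            delta = means[i] - means[b]            # gap to the best mean
            r[i] = (stds[i] / delta) ** 2
    r[b] = stds[b] * math.sqrt(sum(r[i] ** 2 / stds[i] ** 2
                                   for i in range(k) if i != b))
    total = sum(r)
    return [budget * ri / total for ri in r]

# Three designs with equal noise: the runner-up (gap 1) should receive
# more of the 100-replication budget than the distant design (gap 2).
alloc = ocba_allocation([1.0, 2.0, 3.0], [1.0, 1.0, 1.0], 100.0)
```

In a distributed setting such as the paper's, these fractions would decide how many replications each remote worker runs in the next stage.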
{"title":"Distributed Web-based simulation optimization","authors":"Yuh-Chyun Luo, Chun-Hung Chen, E. Yücesan, Insup Lee","doi":"10.1109/WSC.2000.899170","DOIUrl":"https://doi.org/10.1109/WSC.2000.899170","url":null,"abstract":"Web technology is having a significant impact on computer simulation. Most of the effort in Web based simulation is aimed at modeling, particularly at building simulation languages and at creating model libraries that can be assembled and executed over the Web. We focus on the efficiency of simulation experimentation for optimization. We introduce a framework for combining the statistical efficiency of simulation optimization techniques with the effectiveness of parallel execution algorithms. In particular, the Optimal Computing Budget Allocation (OCBA) algorithm is implemented in a Web based environment for low-cost parallel and distributed simulation experimentation. A prototype implementation with some experimental results is presented.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115289894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The paper presents a method that uses initial sample data to choose between statistical procedures for identifying the simulated system with the best (maximum or minimum) expected performance. The method chooses the procedure that minimizes the additional number of simulation replications required to return a pre-specified probability guarantee. This problem may be encountered after a heuristic search procedure has been applied in a simulation-optimization context. In this setting, initial samples from each system may already have been taken, but because of stochastic variation, the system with the best sample mean at the end of the search procedure may not be the true best system encountered during the search. Empirical work in previous papers suggests that the relative number of additional replications required by existing procedures depends on factors such as the configuration of the systems' means and their variances that may be unknown prior to initial data collection. These results motivated the approach taken in the paper, where we postpone the choice between statistical procedures until after observing the initial data.
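The procedures the paper chooses between share a common building block: a second-stage sample size per system that delivers the probability guarantee. One well-known instance is Rinott's two-stage procedure, where the total sample size grows with the first-stage sample variance. A minimal sketch of that formula, as an illustration only (the constant h is tabulated and depends on the number of systems, the confidence level, and the first-stage size n0; names are illustrative, not the paper's):

```python
import math

def rinott_replications(sample_stds, n0, h, delta):
    """Total sample size per system under a Rinott-style two-stage
    procedure: N_i = max(n0, ceil((h * S_i / delta)^2)), where S_i is
    the first-stage sample standard deviation, delta the indifference-
    zone width, and h a tabulated constant for the desired guarantee."""
    return [max(n0, math.ceil((h * s / delta) ** 2)) for s in sample_stds]

# A noisy system needs 36 total observations; a quiet one needs no more
# than the first-stage n0 = 10.
sizes = rinott_replications([2.0, 0.5], 10, 3.0, 1.0)
```

The paper's adaptive idea is to compute such additional-replication counts for several candidate procedures from the initial data and then run whichever demands the least extra simulation.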
{"title":"Adaptively choosing the best procedure for selecting the best system","authors":"Justin Boesel","doi":"10.1109/WSC.2000.899761","DOIUrl":"https://doi.org/10.1109/WSC.2000.899761","url":null,"abstract":"The paper presents a method that uses initial sample data to choose between statistical procedures for identifying the simulated system with the best (maximum or minimum) expected performance. The method chooses the procedure that minimizes the additional number of simulation replications required to return a pre-specified probability guarantee. This problem may be encountered after a heuristic search procedure has been applied in a simulation-optimization context. In this setting, initial samples from each system may already have been taken, but because of stochastic variation, the system with the best sample mean at the end of the search procedure may not be the true best system encountered during the search. Empirical work in previous papers suggests that the relative number of additional replications required by existing procedures depends on factors such as the configuration of the systems' means and their variances that may be unknown prior to initial data collection. These results motivated the approach taken in the paper, where we postpone the choice between statistical procedures until after observing the initial data.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115668171","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The theory of modeling and simulation is well defined as a result of about 30 years of research and practice, and there are commonly accepted approaches and methods for carrying out successful simulation studies. The educational side of simulation is far less settled: there is neither a commonly accepted curriculum nor a basic textbook. The situation on the Web is worse still (mainly for lack of time and resources): the quality of Web-based teaching material on simulation is very heterogeneous, and the pieces do not fit together well. The work of producing teaching materials and exercises is often duplicated. The goal of the paper is to present a working database system for managing links and generating collections of simulation-related material for teaching and learning purposes.
{"title":"A virtual textbook for modeling and simulation","authors":"T. Wiedemann","doi":"10.1109/WSC.2000.899153","DOIUrl":"https://doi.org/10.1109/WSC.2000.899153","url":null,"abstract":"The theory of modeling and simulation is well defined as a result of about 30 years of research and practice. There are commonly accepted approaches and methods of working out successful simulation studies. The educational aspects of simulation are very complicated: there is no common accepted curriculum, nor a basic textbook. The situation with the Web is much worse (mainly as a result of missing time-resources): the quality of the teaching material concerning simulation is very heterogeneous and does not fit well. Often the work of producing teaching materials and exercises is done twice. The goal of the paper is to present a real working database system for managing links and generating collections of simulation related material for teaching and learning purposes.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123132528","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We discuss some of the difficulties present in trace collection and trace-driven cache simulation. We then describe our multiprocessor tracing technique and verify that it accurately collects long traces. We propose sampling as a method to reduce required disk space, enable simulations to run faster, and effectively enlarge the trace buffer of our hardware monitor, decreasing trace distortion. To this end, we investigate time sampling and two types of set sampling. We conclude that the second set sampling technique achieves the most accurate results. The miss rate for the second set sampling method is calculated as the number of misses to sampled sets divided by the total number of references scaled by the sample size. We determined that a 10% sample size was the most accurate while still reducing required disk space.
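The abstract states the estimator for the second set-sampling method explicitly: misses to sampled sets divided by the total number of references scaled by the sample size. A one-function sketch of that calculation (names are illustrative):

```python
def sampled_miss_rate(misses_in_sampled_sets, total_references, sample_fraction):
    """Overall miss-rate estimate from a set sample: misses counted in
    the sampled cache sets, divided by the total reference count scaled
    by the fraction of sets sampled (e.g. 0.10 for a 10% sample)."""
    return misses_in_sampled_sets / (total_references * sample_fraction)

# 1,200 misses observed in a 10% set sample of a trace with one
# million references gives an estimated miss rate of 1.2%.
rate = sampled_miss_rate(1200, 1_000_000, 0.10)
```

The scaling assumes references are spread roughly evenly over sets, which is why the authors compare sample sizes empirically rather than relying on the formula alone.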
{"title":"Facilitating level three cache studies using set sampling","authors":"Niki C. Thornock, J. Flanagan","doi":"10.1109/WSC.2000.899754","DOIUrl":"https://doi.org/10.1109/WSC.2000.899754","url":null,"abstract":"We discuss some of the difficulties present in trace collection and trace-driven cache simulation. We then describe our multiprocessor tracing technique and verify that it accurately collects long traces. We propose sampling as a method to reduce required disk space, enable simulations to run faster, and effectively enlarge the trace buffer of our hardware monitor, decreasing trace distortion. To this end, we investigate time sampling and two types of set sampling. We conclude that the second set sampling technique achieves the most accurate results. The miss rate for the second set sampling method is calculated as the number of misses to sampled sets divided by the total number of references scaled by the sample size. We determined that a 10% sample size was the most accurate while still reducing required disk space.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"162 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116904925","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The purpose of the paper is to introduce a new approach to teaching an introductory simulation course using an interactive CD-ROM titled "Simply Simulation". This method utilizes several multimedia tools and a hypertext based Web format. The simulation literature currently shows no studies on this proposed new teaching method. Course structure, requirements, and benefits of Simply Simulation are described. Simply Simulation gives detailed explanations on simulation concepts and easy-to-follow instructions in five modules. The student uses Taylor II process simulation software to model and analyze progressively more complex real life situations. Competencies gained are measured via a pretest at the beginning of each module and a quiz at the end of each module. The paper and Simply Simulation contribute to the simulation education literature by exemplifying how to enhance the learning effectiveness by utilizing various information technologies and teaching methods.
{"title":"Simply Simulation: an interactive CD-ROM-based approach for learning simulation concepts","authors":"C. Nott, Graham Nott, C. C. Lee","doi":"10.1109/WSC.2000.899159","DOIUrl":"https://doi.org/10.1109/WSC.2000.899159","url":null,"abstract":"The purpose of the paper is to introduce a new approach to teaching an introductory simulation course using an interactive CD-ROM titled \"Simply Simulation\". This method utilizes several multimedia tools and a hypertext based Web format. The simulation literature currently shows no studies on this proposed new teaching method. Course structure, requirements, and benefits of Simply Simulation are described. Simply Simulation gives detailed explanations on simulation concepts and easy-to-follow instructions in five modules. The student uses Taylor II process simulation software to model and analyze progressively more complex real life situations. Competencies gained are measured via a pretest at the beginning of each module and a quiz at the end of each module. The paper and Simply Simulation contribute to the simulation education literature by exemplifying how to enhance the learning effectiveness by utilizing various information technologies and teaching methods.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116932038","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Decision making is an essential part of any construction operation, and simulation can assist construction managers in making informed decisions. In this paper, simulation is applied to a concrete batch plant to analyze alternative solutions and resource management. Data are collected to define activity durations for the plant, and a simulation model of the plant is constructed using the Micro CYCLONE simulation system. Based on sensitivity analysis, three management tools are constructed to support the decision-maker: a time-cost-quantity chart, a feasible-region analysis, and a contour-lines chart. The time-cost-quantity and contour-lines charts are used to decide production time, production cost, and required resources for a given haul distance from the plant. The feasible-region chart is used to identify the range of alternative solutions that minimize production time and cost for the available plant resources, given the required transportation distance.
{"title":"Simulation as a tool for resource management","authors":"T. Zayed, D. Halpin","doi":"10.1109/WSC.2000.899184","DOIUrl":"https://doi.org/10.1109/WSC.2000.899184","url":null,"abstract":"The decision-making process is a very essential part of any construction operation. Simulation can be used as a tool to assist construction managers in making informed decisions. In this paper, simulation is applied to a concrete batch plant to analyze alternative solutions and resource management. Data is collected to define activity durations for the plant. A simulation model is constructed for the plant using the Micro CYCLONE simulation system. Based on sensitivity analysis, management tools are constructed to help the decision-maker. These tools are a time-cost-quantity chart, a feasible region analysis and a contour lines chart. Time-cost-quantity and contour lines charts are used for deciding production time, production cost and required resources for a required distance from the plant. The feasible region chart is used for deciding the range of alternative solutions that can be taken to minimize production time and cost of the available plant resources according to the required transportation distance.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125043386","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Monte Carlo simulation is a popular method for pricing financial options and other derivative securities because of the availability of powerful workstations and recent advances in applying the tool. The existence of easy-to-use software makes simulation accessible to many users who would otherwise avoid programming the algorithms necessary to value derivative securities. This paper presents examples of option pricing and variance reduction, and demonstrates their implementation with Crystal Ball 2000, a spreadsheet simulation add-in program.
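The paper demonstrates option pricing and variance reduction in Crystal Ball 2000. As a language-agnostic illustration of the same idea — Monte Carlo pricing of a European call under geometric Brownian motion, with antithetic variates as the variance-reduction device — here is a generic textbook estimator, not the paper's spreadsheet model:

```python
import math
import random

def mc_call_price(s0, strike, r, sigma, t, n_pairs, seed=2000):
    """Monte Carlo price of a European call under geometric Brownian
    motion, using antithetic variates: each standard normal draw z is
    paired with -z, which reduces estimator variance for monotone
    payoffs such as the call."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff_sum = 0.0
    for _ in range(n_pairs):
        z = rng.gauss(0.0, 1.0)
        for zz in (z, -z):                        # antithetic pair
            st = s0 * math.exp(drift + vol * zz)  # terminal stock price
            payoff_sum += max(st - strike, 0.0)
    return math.exp(-r * t) * payoff_sum / (2 * n_pairs)

# 20,000 antithetic pairs; the Black-Scholes value for these
# parameters (S=K=100, r=5%, sigma=20%, T=1) is about 10.45, so the
# estimate should land nearby.
price = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 20000)
```

A spreadsheet add-in like the one in the paper hides exactly this loop behind cell formulas and distribution assumptions attached to cells.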
{"title":"Using simulation for option pricing","authors":"J. Charnes","doi":"10.1109/WSC.2000.899710","DOIUrl":"https://doi.org/10.1109/WSC.2000.899710","url":null,"abstract":"Monte Carlo simulation is a popular method for pricing financial options and other derivative securities because of the availability of powerful workstations and recent advances in applying the tool. The existence of easy-to-use software makes simulation accessible to many users who would otherwise avoid programming the algorithms necessary to value derivative securities. This paper presents examples of option pricing and variance reduction, and demonstrates their implementation with Crystal Ball 2000, a spreadsheet simulation add-in program.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125886592","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We develop and evaluate algorithms for generating random variates for simulation input. One group called automatic, or black-box algorithms can be used to sample from distributions with known density. They are based on the rejection principle. The hat function is generated automatically in a setup step using the idea of transformed density rejection. There, the density is transformed into a concave function and the minimum of several tangents is used to construct the hat function. The resulting algorithms are not too complicated and are quite fast. The principle is also applicable to random vectors. A second group of algorithms is presented that generate random variates directly from a given sample by implicitly estimating the unknown distribution. The best of these algorithms are based on the idea of naive resampling plus added noise. These algorithms can be interpreted as sampling from the kernel density estimates. This method can be also applied to random vectors. There, it can be interpreted as a mixture of naive resampling and sampling from the multi-normal distribution that has the same covariance matrix as the data. The algorithms described in the paper have been implemented in ANSI C in a library called UNURAN which is available via anonymous ftp.
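The second group of algorithms — naive resampling plus added noise, interpretable as sampling from a kernel density estimate — is easy to sketch. The following is a minimal smoothed-bootstrap illustration, not UNURAN's actual C API; the bandwidth choice (e.g. a Silverman-type rule) is left to the caller, and names are illustrative:

```python
import random

def smoothed_bootstrap(data, n, bandwidth, seed=None):
    """Draw n variates from a Gaussian kernel density estimate of
    `data`: pick an observation uniformly at random (naive resampling),
    then add centred normal noise whose standard deviation is the
    kernel bandwidth."""
    rng = random.Random(seed)
    return [rng.choice(data) + rng.gauss(0.0, bandwidth) for _ in range(n)]

# 1,000 variates whose distribution smooths the five observed points.
sample = smoothed_bootstrap([1.0, 2.0, 3.0, 4.0, 5.0], 1000, 0.5, seed=1)
```

In the vector case described in the paper, the added noise is multinormal with the sample covariance of the data, so the generated vectors preserve the observed correlation structure.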
{"title":"Automatic random variate generation for simulation input","authors":"W. Hörmann, J. Leydold","doi":"10.1109/WSC.2000.899779","DOIUrl":"https://doi.org/10.1109/WSC.2000.899779","url":null,"abstract":"We develop and evaluate algorithms for generating random variates for simulation input. One group called automatic, or black-box algorithms can be used to sample from distributions with known density. They are based on the rejection principle. The hat function is generated automatically in a setup step using the idea of transformed density rejection. There, the density is transformed into a concave function and the minimum of several tangents is used to construct the hat function. The resulting algorithms are not too complicated and are quite fast. The principle is also applicable to random vectors. A second group of algorithms is presented that generate random variates directly from a given sample by implicitly estimating the unknown distribution. The best of these algorithms are based on the idea of naive resampling plus added noise. These algorithms can be interpreted as sampling from the kernel density estimates. This method can be also applied to random vectors. There, it can be interpreted as a mixture of naive resampling and sampling from the multi-normal distribution that has the same covariance matrix as the data. The algorithms described in the paper have been implemented in ANSI C in a library called UNURAN which is available via anonymous ftp.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126174143","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Once a simulation model is developed, designed experiments may be employed to optimize the system efficiently; designed experiments are used on "real" production systems as well. The first step is to screen for important independent variables. Several screening methods are compared and contrasted in terms of efficiency, effectiveness, and robustness, ranging from classical factorial designs and two-stage group screening to newer designs such as sequential bifurcation (SB) and iterated fractional factorial designs (IFFD). Conditions for the use of each method are provided, along with references on how to apply them.
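Of the methods surveyed, sequential bifurcation has the simplest core loop: start with all factors in one group, discard any group whose aggregate effect is below a threshold, and bisect the rest until the important factors are isolated. A noise-free toy sketch, assuming non-negative factor effects (the standard SB assumption); `group_effect` is a hypothetical oracle standing in for the pair of simulation runs that would measure a group's joint effect:

```python
def sequential_bifurcation(group_effect, n_factors, threshold):
    """Toy, deterministic sequential bifurcation. Real SB handles
    simulation noise with replications; this sketch only shows the
    bisection logic over factor index ranges [lo, hi]."""
    important, groups = [], [(0, n_factors - 1)]
    while groups:
        lo, hi = groups.pop()
        if group_effect(lo, hi) <= threshold:
            continue                        # whole group unimportant
        if lo == hi:
            important.append(lo)            # single factor isolated
        else:
            mid = (lo + hi) // 2
            groups += [(lo, mid), (mid + 1, hi)]
    return sorted(important)
```

With 20 factors of which only two matter, the procedure finds both while evaluating far fewer groups than the 2^20 design points a full factorial would require — the efficiency argument the paper examines.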
{"title":"Finding important independent variables through screening designs: a comparison of methods","authors":"Linda Trocine, L. Malone","doi":"10.1109/WSC.2000.899789","DOIUrl":"https://doi.org/10.1109/WSC.2000.899789","url":null,"abstract":"Once a simulation model is developed, designed experiments may be employed to efficiently optimize the system. Designed experiments are used on \"real\" production systems as well. The first step is to screen for important independent variables. Several screening methods are compared and contrasted in terms of efficiency, effectiveness and robustness. These screening methods range from the classical factorial designs and two-stage group screening to new, more novel designs including sequential bifurcation (SB) and iterated fractional factorial designs (IFFD). Conditions for the use of the methods are provided along with references on how to use them.","PeriodicalId":333727,"journal":{"name":"2000 Winter Simulation Conference Proceedings (Cat. No.00CH37165)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125471227","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}