Acknowledgment of Reviewers' Services
Pub Date: 2011-07-01  DOI: 10.1080/1941658x.2011.631853
A Probabilistic Approach to Determining the Number of Units to Build in a Yield-Constrained Process
Timothy P. Anderson
Pub Date: 2011-07-01  DOI: 10.1080/1941658X.2011.585331
Many cost estimating problems involve determining the number of units to build in a yield-constrained manufacturing process, when it takes, on average, n attempts to produce m successes (m ≤ n). Examples include computer chips, focal plane arrays, circuit boards, field programmable gate arrays, etc. The simplistic approach to this problem is to multiply the number of units needed, m, by the expected number of attempts needed to produce a single success, n. For example, if a contractor reports that it takes, on average, 10 attempts to build one working unit, and if four such units are needed for a space-borne application, then the simplistic approach would be to plan for 4 × 10 = 40 units, and estimate the cost accordingly. However, if the cost analyst uses the simplistic approach, he or she is likely to be disappointed, as the probability that 40 attempts will actually produce four working units is only about 57%. Consequently, there is a 43% probability that 40 attempts will be insufficient. In fact, if the analyst wants to have, say, 80% confidence that four working units will be available, then he or she should plan for 54 attempts! Obviously, this could have a huge impact on the cost estimate. The purpose of this research is to describe the nature of the problem, to justify modeling the problem in terms of a negative binomial random variable, and to develop the necessary thought process that one must go through in order to adequately determine the number of units to build given a desired level of confidence. This understanding will be of great benefit to cost analysts who are in the position of estimating costs when certain hardware elements behave as described previously. The technique will also be very useful in cost uncertainty analysis, enabling the cost analyst to determine the appropriate probability distribution for the number of units needed to achieve success in their programs.
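As a hedged illustration of the arithmetic quoted in this abstract, the sketch below uses SciPy's negative binomial distribution with the abstract's own example figures (a 1-in-10 yield and four required units) to reproduce the roughly 57% confidence for 40 attempts and the 54 attempts needed for 80% confidence. The code is illustrative and is not taken from the article itself.

```python
# Sketch (not from the article): verifying the yield-constrained build quantities
# quoted in the abstract with a negative binomial model.
from scipy.stats import nbinom

p = 0.10        # per-attempt success probability (one working unit per 10 attempts, on average)
m = 4           # working units required
n_planned = 40  # attempts under the "simplistic" plan (4 x 10)

# nbinom(m, p) models the number of FAILED attempts before the m-th success,
# so total attempts = failures + m.
prob_enough = nbinom.cdf(n_planned - m, m, p)
print(f"P({m} working units within {n_planned} attempts) = {prob_enough:.2%}")  # about 57%

# Attempts needed for 80% confidence of obtaining m working units (54, per the abstract).
attempts_80 = int(nbinom.ppf(0.80, m, p)) + m
print(f"Attempts for 80% confidence: {attempts_80}")
```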
{"title":"A Probabilistic Approach to Determining the Number of Units to Build in a Yield-Constrained Process","authors":"Timothy P. Anderson","doi":"10.1080/1941658X.2011.585331","DOIUrl":"https://doi.org/10.1080/1941658X.2011.585331","url":null,"abstract":"Many cost estimating problems involve determining the number of units to build in a yield-constrained manufacturing process, when it takes, on average, n attempts to produce m successes (m ≤ n). Examples include computer chips, focal plane arrays, circuit boards, field programmable gate arrays, etc. The simplistic approach to this problem is to multiply the number of units needed, m, by the expected number of attempts needed to produce a single success, n. For example, if a contractor reports that it takes, on average, 10 attempts to build one working unit, and if four such units are needed for a space-borne application, then the simplistic approach would be to plan for 4 × 10 = 40 units, and estimate the cost accordingly. However, if the cost analyst uses the simplistic approach, he or she is likely to be disappointed, as the probability that 40 attempts will actually produce four working units is only about 57%. Consequently, there is a 43% probability that 40 attempts will be insufficient. In fact, if the analyst wants to have, say, 80% confidence that four working units will be available, then he/she should plan for 54 attempts! Obviously, this could have a huge impact on the cost estimate. The purpose of this research is to describe the nature of the problem, to justify modeling the problem in terms of a negative binomial random variable, and to develop the necessary thought process that one must go through in order to adequately determine the number of units to build given a desired level of confidence. This understanding will be of great benefit to cost analysts who are in the position of estimating costs when certain hardware elements behave as described previously. The technique will also be very useful in cost uncertainty analysis, enabling the cost analyst to determine the appropriate probability distribution for the number of units needed to achieve success in their programs.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125202035","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Probabilistic Method for Predicting Software Code Growth
Mike Ross
Pub Date: 2011-07-01  DOI: 10.1080/1941658X.2011.629494
A significant challenge that many cost analysts and project managers face is predicting by how much their initial estimates of software development cost and schedule will change over the life cycle of the project. Examination of currently accepted software cost, schedule, and defect estimation algorithms reveals a common acknowledgment that estimated software size is the single most influential independent variable. Unfortunately, the most important business decisions about a software project are made at its beginning, the time when most estimating is done, and coincidentally the time of minimum knowledge, maximum uncertainty, and hysterical optimism. This article describes a model and methodology that provide a probabilistic growth adjustment to single-point Technical Baseline Estimates of Delivered Source Lines of Code, for both new software and pre-existing reused software, that is sensitive to the maturity of those single-point estimates. The model is based on Software Resources Data Report data collected by the U.S. Air Force and has been used as part of the basis for several USAF program office estimates and independent cost estimates. It provides an alternative to other software code growth methodologies, such as Holchin's and Jensen's code growth matrices.
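The article's growth distributions are calibrated from SRDR data and are not reproduced here; the sketch below only illustrates the general mechanics of applying a probabilistic growth factor to a single-point size estimate. The lognormal form and every parameter value are assumptions for illustration, not Ross's calibrated model.

```python
# Illustrative sketch only: applying a probabilistic growth factor to a
# single-point software size estimate. The lognormal parameters below are
# placeholders, NOT the SRDR-calibrated values from the article.
import numpy as np

rng = np.random.default_rng(1)

baseline_sloc = 150_000            # single-point Technical Baseline Estimate (assumed)
median_growth, sigma = 1.25, 0.30  # assumed: 25% median growth, spread reflecting estimate maturity

growth = rng.lognormal(mean=np.log(median_growth), sigma=sigma, size=100_000)
adjusted = baseline_sloc * growth

for pct in (50, 70, 80):
    print(f"{pct}th percentile size: {np.percentile(adjusted, pct):,.0f} SLOC")
```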
{"title":"A Probabilistic Method for Predicting Software Code Growth","authors":"Mike Ross","doi":"10.1080/1941658X.2011.629494","DOIUrl":"https://doi.org/10.1080/1941658X.2011.629494","url":null,"abstract":"A significant challenge that many cost analysts and project managers face is predicting by how much their initial estimates of software development cost and schedule will change over the lifecycle of the project. Examination of currently-accepted software cost, schedule, and defect estimation algorithms reveals a common acknowledgment that estimated software size is the single most influential independent variable. Unfortunately, the most important business decisions about a software project are made at its beginning, the time when most estimating is done, and coincidently the time of minimum knowledge, maximum uncertainty, and hysterical optimism. This article describes a model and methodology that provides probabilistic growth adjustment to single-point Technical Baseline Estimates of Delivered Source Lines of Code, for both new software and pre-existing reused software that is sensitive to the maturity of their single-point estimates. The model is based on Software Resources Data Report data collected by the U.S. Air Force and has been used as part of the basis for several USAF program office estimates and independent cost estimates. It provides an alternative to other software code growth methodologies, such as Holchin's and Jensen's code growth matrices.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125057230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Statistical Foundations of Adaptive Cost-Estimating Relationships
Stephen A. Book, M. Broder, D. Feldman
Pub Date: 2011-01-01  DOI: 10.1080/1941658X.2011.585333
Traditional development of cost-estimating relationships (CERs) has been based on "full" data sets consisting of all available cost and technical data associated with a particular class of products of interest, e.g., components, subsystems, or entire systems of satellites, ground systems, and the like. In this article, we review an extension of the concept of "analogy estimating" to parametric estimating, namely the concept of "adaptive" CERs: CERs that are based on specific knowledge of individual data points that may be more relevant to a particular estimating problem than would be the full data set. The goal of adaptive CER development is to be able to apply CERs that have smaller estimating error and narrower prediction bounds. Several examples of adaptive CERs were provided in a presentation (Book & Broder, 2008) by the first two authors to the May 2008 SSCAG Meeting in Noordwijk, Holland, and the June 2008 SCEA/ISPA Conference in Industry Hills, CA. This article focuses on the statistical foundations of the derivation of adaptive CERs, namely, the method of weighted least-squares regression. Ordinary least-squares regression has traditionally been applied to historical-cost data in order to derive additive-error CERs valid over an entire data range, subject to the requirement that all data points be weighted equally and have residuals that are distributed according to a common normal distribution. The idea behind adaptive CERs, however, is that data points should be "deweighted" based on some function of their distance from the point at which an estimate is to be made. This means that each historical data point should be assigned a "weight" that reflects its importance to the particular estimation that is to be made using the derived CER. This article describes the technical details of the weighted least-squares derivation process, the resulting quality metrics, and the role weighting plays in adaptive-CER development.
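As a hedged sketch of the weighted least-squares mechanics described above, and not the authors' specific weighting function, the example below deweights historical data points by their distance from the point to be estimated using an assumed Gaussian kernel and then fits a linear CER with NumPy. The data, bandwidth, and kernel are all illustrative assumptions.

```python
# Sketch of distance-weighted least squares for an "adaptive" CER.
# The Gaussian deweighting kernel and the data are illustrative assumptions.
import numpy as np

# Historical data: cost driver x (e.g., mass in kg) and cost y (assumed values, $M).
x = np.array([100., 150., 220., 300., 410., 520., 640.])
y = np.array([ 12.,  17.,  26.,  33.,  47.,  60.,  71.])

x_star = 250.0                                # point at which an estimate is needed
h = 150.0                                     # assumed kernel bandwidth
w = np.exp(-0.5 * ((x - x_star) / h) ** 2)    # nearer points get larger weights

# Weighted least squares: scale the design matrix rows and response by sqrt(w).
X = np.column_stack([np.ones_like(x), x])
sw = np.sqrt(w)
beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)

a, b = beta
print(f"Adaptive CER near x = {x_star:.0f}: cost = {a:.2f} + {b:.4f} * x")
print(f"Estimate at x*: {a + b * x_star:.2f} $M")
```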
{"title":"Statistical Foundations of Adaptive Cost-Estimating Relationships","authors":"Stephen A. Book, M. Broder, D. Feldman","doi":"10.1080/1941658X.2011.585333","DOIUrl":"https://doi.org/10.1080/1941658X.2011.585333","url":null,"abstract":"Traditional development of cost-estimating relationships (CERs) has been based on “full” data sets consisting of all available cost and technical data associated with a particular class of products of interest, e.g., components, subsystems or entire systems of satellites, ground systems, etc. In this article, we review an extension of the concept of “analogy estimating” to parametric estimating, namely the concept of “adaptive” CERs—CERs that are based on specific knowledge of individual data points that may be more relevant to a particular estimating problem than would the full data set. The goal of adaptive CER development is to be able to apply CERs that have smaller estimating error and narrower prediction bounds. Several examples of adaptive CERs were provided in a presentation (Book & Broder, 2008) by the first two authors to the May 2008 SSCAG Meeting in Noordwijk, Holland, and the June 2008 SCEA/ISPA Conference in Industry Hills, CA. This article focuses on statistical foundations of the derivation of adaptive CERs, namely, the method of weighted least-squares regression. Ordinary least-squares regression has been traditionally applied to historical-cost data in order to derive additive-error CERs valid over an entire data range, subject to the requirement that all data points be weighted equally and have residuals that are distributed according to a common normal distribution. The idea behind adaptive CERs, however, is that data points should be “deweighted” based on some function of their distance from the point at which an estimate is to be made. This means that each historical data point should be assigned a “weight” that reflects its importance to the particular estimation that is to be made using the derived CER. This presentation describes technical details of the weighted least-squares derivation process, resulting quality metrics, and the roles it plays in adaptive-CER development.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128281447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Use of Earned Value Management Trends to Forecast Cost Risks
Roy E. Smoker
Pub Date: 2011-01-01  DOI: 10.1080/1941658X.2011.585335
This article uses earned value management (EVM) trend analysis to forecast trends in budget at completion (BAC) and budgeted cost of work performed (BCWP). The resulting equations are then used to solve for the expected month of completion. With that completion date in hand, the article uses trend analysis to find the estimate at completion (EAC) and the BAC at that future month and to solve for the variance at completion (VAC). By measuring variance against a baseline, the article shows how much risk the program will incur by the completion date. A monthly risk burndown chart is developed to illustrate how the program burns down risk over its life. It indicates that the rate of risk burndown may well be more rapid than the rate of accomplishment of remaining work. The article concludes that program managers would be well advised to require analysis of EVM trends to understand how much additional schedule is added to a contract with each addition of scope, as measured by the increase in BAC over time.
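A minimal sketch of the trend-extrapolation mechanics described above, under simplifying assumptions: linear monthly trends for BAC, BCWP, and EAC, invented data, and completion defined as the month at which the BCWP trend catches up with the BAC trend. This is not the article's data set or its exact formulation.

```python
# Illustrative EVM trend extrapolation (assumed linear trends and made-up data, $K).
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(1, 13)                                  # reporting months 1..12
bac  = 1000 + 12.0 * months + rng.normal(0, 5, 12)         # budget at completion
bcwp =   40 + 55.0 * months + rng.normal(0, 8, 12)         # budgeted cost of work performed
eac  = 1050 + 20.0 * months + rng.normal(0, 6, 12)         # estimate at completion

fit = lambda y: np.polyfit(months, y, 1)                   # (slope, intercept) of a linear trend
(b_bac, a_bac), (b_bcwp, a_bcwp), (b_eac, a_eac) = fit(bac), fit(bcwp), fit(eac)

# Completion month: where the BCWP trend reaches the (growing) BAC trend.
t_complete = (a_bac - a_bcwp) / (b_bcwp - b_bac)
bac_at_completion = a_bac + b_bac * t_complete
eac_at_completion = a_eac + b_eac * t_complete
vac = bac_at_completion - eac_at_completion                # variance at completion

print(f"Projected completion month: {t_complete:.1f}")
print(f"Projected VAC at completion: {vac:,.1f} $K")
```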
{"title":"Use of Earned Value Management Trends to Forecast Cost Risks","authors":"Roy E. Smoker","doi":"10.1080/1941658X.2011.585335","DOIUrl":"https://doi.org/10.1080/1941658X.2011.585335","url":null,"abstract":"This article uses earned value management trend analysis to forecast trends in BAC and BCWP. The resulting equations are then used to solve for the expected month at completion. With the month at completion date in hand, the article uses trend analysis to find the EAC at that month along with the BAC at that month far in the future to solve for VAC. By using variance against a baseline, the article shows how much risk this program will incur by the date at completion. A monthly risk burndown chart is developed to illustrate how the program burns down risk during life of the program. It indicates that the rate of risk burndown may very well be more rapid than the rate of accomplishment of remaining work. The article concludes that program managers would be well advised to require analysis of EVM trends to understand how much additional schedule is being added to a contract with each addition of scope as measured by the increase in BAC over time.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128577101","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Application of Data Mining Algorithms for Shipbuilding Cost Estimation
Bohdan L. Kaluzny, Sorin Barbici, Göran Berg, Renzo Chiomento, Dimitrios Derpanis, Ulf J. Jonsson, R. H. A. D. Shaw, M. Smit, Franck Ramaroson
Pub Date: 2011-01-01  DOI: 10.1080/1941658X.2011.585336
This article presents a novel application of known data mining algorithms to the problem of estimating the cost of ship development and construction. The work is a product of the North Atlantic Treaty Organization Research and Technology Organization Systems Analysis and Studies 076 Task Group, "NATO Independent Cost Estimating and its Role in Capability Portfolio Analysis." In a blind, ex post exercise, the Task Group set out to estimate the cost of a class of Netherlands amphibious assault ships and then compare the estimates to the actual costs (the Royal Netherlands Navy withheld the actual ship costs until the exercise was completed). Two cost estimating approaches were taken: parametric analysis and costing by analogy. For the parametric approach, the M5 system of Quinlan (1992), a combination of decision trees and linear regression models for learning models that predict numeric values, was employed. Agglomerative hierarchical cluster analysis and non-linear optimization were used for a cost-estimation-by-analogy approach devoid of subjectivity.
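The Task Group's ship data are not given in this abstract, so the sketch below only illustrates the cluster-analysis-by-analogy idea with made-up ship attributes: standardize the attributes, build an agglomerative hierarchy with SciPy, and estimate the target ship's cost from the historical ships that fall in its cluster. The feature set, linkage method, and cluster count are assumptions, not the Task Group's settings.

```python
# Illustrative analogy costing via agglomerative hierarchical clustering.
# Ship attributes, linkage method, and cluster count are assumptions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Columns: displacement (t), length (m), crew. Last row is the target ship (cost unknown).
ships = np.array([
    [ 9_000, 130, 120],
    [12_000, 150, 150],
    [16_000, 176, 180],
    [ 5_500, 110,  90],
    [14_000, 166, 160],   # target ship
], dtype=float)
costs = np.array([310., 420., 560., 210.])   # known costs ($M) for the first four ships

z = (ships - ships.mean(axis=0)) / ships.std(axis=0)      # standardize attributes
labels = fcluster(linkage(z, method="average"), t=3, criterion="maxclust")

target_label = labels[-1]
analogues = np.where(labels[:-1] == target_label)[0]      # historical ships in the same cluster
estimate = costs[analogues].mean() if analogues.size else costs.mean()
print(f"Analogue ships: {analogues.tolist()}, cost estimate: {estimate:.0f} $M")
```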
{"title":"An Application of Data Mining Algorithms for Shipbuilding Cost Estimation","authors":"Bohdan L. Kaluzny, Sorin Barbici, Göran Berg, Renzo Chiomento, Dimitrios Derpanis, Ulf J. Jonsson, R. H. A. D. Shaw, M. Smit, Franck Ramaroson","doi":"10.1080/1941658X.2011.585336","DOIUrl":"https://doi.org/10.1080/1941658X.2011.585336","url":null,"abstract":"This article presents a novel application of known data mining algorithms to the problem of estimating the cost of ship development and construction. The work is a product of North Atlantic Treaty Organization Research and Technology Organization Systems Analysis and Studies 076 Task Group “NATO Independent Cost Estimating and its Role in Capability Portfolio Analysis.” In a blind, ex post exercise, the Task Group set out to estimate the cost of a class of Netherlands' amphibious assault ships, and then compare the estimates to the actual costs (the Netherlands Royal Navy withheld the actual ship costs until the exercise was completed). Two cost estimating approaches were taken: parametric analysis and costing by analogy. For the parametric approach, the M5 system (a combination of decision trees and linear regression models) of Quinlan (1992) for learning models that predict numeric values was employed. Agglomerative hierarchical cluster analysis and non-linear optimization was used for a cost estimation by analogy approach void of subjectivity.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125410614","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Historical Trend Analysis Analysed
Dale Shermon
Pub Date: 2011-01-01  DOI: 10.1080/1941658X.2011.585329
This article describes three alternative approaches to historical trend analysis. First, the study considers the trend over time of the complexities of past systems, which results from applying a parametric cost model (PRICE H) to normalise historical projects' costs and then plotting complexity over time. Second, it considers the trend over time of equipment production cost, which has been observed as the tendency for 'the cost of successive generations of equipment to continue to rise at above the rate of inflation,' commonly referred to as 'Defence equipment cost growth.' Finally, it presents an analysis of technology over time through the application of multi-variable, forward step-wise regression (the true concepts methodology), one of the variables in the regression analysis being the cost residual versus time, representing the cost of technology growth. The article describes the advantages and disadvantages of each historical trend analysis method. The research study indicates when each method might be applicable and in what circumstances it is dangerous to consider its usage. A case study is used to examine the effect and accuracy of each of the methods; it considers the historical trend for a particular system and predicts the future cost of a possible acquisition. The objective of the study is to stimulate discussion amongst the cost community as to the usage of historical trend analysis, a common term that has not matured in many ways. The historical trend analysis technique is transferable and equally applicable to commercial or government organisations wishing to predict their own costs.
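As a small hedged illustration of the second approach (cost growth above inflation), the sketch below fits a log-linear trend to inflation-adjusted unit costs of successive equipment generations and extrapolates it to a future acquisition year. The data are invented and the exponential-trend form is an assumption, not Shermon's PRICE H based normalisation.

```python
# Illustrative estimate of real ("above inflation") cost growth across
# successive equipment generations. Data and trend form are assumptions.
import numpy as np

year      = np.array([1975, 1985, 1994, 2003, 2012])
real_cost = np.array([ 18.,  27.,  41.,  60.,  95.])   # unit cost in constant-year $M (assumed)

slope, intercept = np.polyfit(year - year[0], np.log(real_cost), 1)
annual_growth = np.expm1(slope)
print(f"Estimated real cost growth: {annual_growth:.1%} per year above inflation")

# Extrapolate to a possible future acquisition year.
t_future = 2025 - year[0]
print(f"Projected 2025 real unit cost: {np.exp(intercept + slope * t_future):.0f} $M")
```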
{"title":"Historical Trend Analysis Analysed","authors":"Dale Shermon","doi":"10.1080/1941658X.2011.585329","DOIUrl":"https://doi.org/10.1080/1941658X.2011.585329","url":null,"abstract":"This article describes three alternative approaches to historical trend analysis. First, the study considers the trend over time of the complexities of past systems. This results from the application of a parametric cost model (PRICE H) to the normalisation of historical projects' costs and to the plotting of the complexity over time. Second, the trend over time of the equipment production cost, which has been observed as ‘the cost of successive generations of equipment to continue to rise at above the rate of inflation,’ commonly referred to as ‘Defence equipment cost growth.’ Finally, an analysis of technology over time through the application of multi-variable, forward step-wise regression (true concepts methodology)—one of the variables in the regression analysis being the cost residual versus time representing the cost of technology growth. The article describes the advantages and disadvantages of each historical trends analysis method. The research study indicates when each method might be applicable and in what circumstances it is dangerous to consider their usage. A case study has been used to consider the effect and accuracy of each of the methods. This review has considered the historical trend for a particular system and predicted the future cost of a possible acquisition. The objective of the study is to stimulate discussion amongst the cost community as to the usage of historical trends analysis, a common term that has not matured in many ways. The historical trends analysis technique is transferable and equally applicable to commercial or government organisations wishing to predict their own costs.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"39 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126077619","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Acknowledgment of Reviewers' Services
Pub Date: 2011-01-01  DOI: 10.1080/1941658x.2011.585337
Cost Risk as a Discriminator in Trade Studies
Stephen A. Book
Pub Date: 2010-11-01  DOI: 10.1080/1941658X.2010.10462234
Prior to formal program initiation, analysts typically undertake trade studies to investigate which of several candidate architectures or designs can best provide a desired capability at minimum cost. The various candidates, however, typically differ significantly in risk and uncertainty as well as in cost, but members of the government or industry trade-study team do not have the time, and the candidate solutions are usually not sufficiently detailed at this stage, to allow a thorough risk analysis to be conducted. Yet those differences in risk and uncertainty, as well as in cost, should be taken into account to the extent possible during the trade-study decision process. Because timeliness and simplicity are key requirements of analyses undertaken in support of trade studies, what usually happens is that a "point" cost estimate, or perhaps a 50%-confidence estimate, is established for each candidate, and the go-ahead decision is made based on that estimate. A nagging question remains: "What if Candidate A, the lower-cost option based on those estimates, faces risk issues that make its 80th-percentile cost higher than that of Candidate B?" In other words, Candidate B would be the lower-cost option if the cost comparison were made at the 80% confidence level. This is the classic situation in which the decision maker must choose between a low-cost, high-risk option and a high-cost, low-risk option. This article offers a methodology that allows the program manager to take account of all risk scenarios by making use of all cost percentiles simultaneously, namely the entire cost probability distribution of each candidate, not simply the point estimate or the 80%-confidence cost. As it turns out, expressing system cost in terms of a lognormal or simulation-generated probability distribution makes it possible to estimate the probability that each candidate will turn out to be the least costly of all the options, and probabilities of that kind are the basis on which an informed decision can be made.
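As a hedged sketch of the comparison the article advocates, using whole cost distributions rather than single percentiles, the example below represents each candidate's cost as a lognormal distribution and estimates by simulation the probability that each candidate is the least costly. The medians and spreads are illustrative assumptions, not values from the article.

```python
# Sketch: probability that each candidate is the least-cost option when each
# candidate's cost is a full (here, lognormal) distribution. Parameters are assumed.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Candidate A: low median cost, high uncertainty; B and C: higher medians, lower uncertainty.
candidates = {
    "A": rng.lognormal(mean=np.log(100), sigma=0.45, size=n),
    "B": rng.lognormal(mean=np.log(115), sigma=0.15, size=n),
    "C": rng.lognormal(mean=np.log(125), sigma=0.10, size=n),
}

draws = np.column_stack(list(candidates.values()))
winners = draws.argmin(axis=1)
for i, name in enumerate(candidates):
    print(f"P({name} is least costly) = {np.mean(winners == i):.1%}")

# The same draws show how the ranking can flip at higher confidence levels.
for name, x in candidates.items():
    print(f"{name}: 50th pct = {np.percentile(x, 50):6.1f}, 80th pct = {np.percentile(x, 80):6.1f}")
```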
{"title":"Cost Risk as a Discriminator in Trade Studies","authors":"Stephen A. Book","doi":"10.1080/1941658X.2010.10462234","DOIUrl":"https://doi.org/10.1080/1941658X.2010.10462234","url":null,"abstract":"Abstract Prior to formal program initiation, analysts typically undertake trade studies to investigate which of several candidate architectures or designs can best provide a desired capability at minimum cost. The various candidates, however, typically differ significantly in risk and uncertainty as well as in cost, but members of the government or industry trade-study team do not have the time and the candidate solutions usually aren't sufficiently detailed at this stage to allow a thorough risk analysis to be conducted. Yet, those differences in risk and uncertainty, as well as in cost, should be taken into account to the extent possible during the trade-study decision process. Because timeliness and simplicity are key requirements of analyses undertaken in support of trade studies, what usually happens is that a “point” cost estimate, or perhaps a 50%-confidence estimate, is established for each candidate, and the go-ahead decision is made based on that estimate. A nagging question remains: “What if Candidate A, the lower-cost option based on those estimates, faces risk issues that make its 80th-percentile cost higher than that of Candidate B?” In other words, Candidate B would be the lower-cost option if the cost comparison were made at the 80% confidence level. This situation is classic, where the decision maker must choose between a low-cost, high-risk option and a high-cost, low-risk option. This article offers a methodology that allows the program manager to take account of all risk scenarios by making use of all cost percentiles simultaneously, namely the entire cost probability distribution of each candidate not simply the point estimate or the 80% confidence cost. As it turns out, the expression of system cost in terms of a lognormal or simulation-generated probability distribution makes it possible to estimate the probability that each candidate will turn out to be the least costly of all the options, and probabilities of that kind are the basis on which an informed decision can be made.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"96 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121831870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Probable Lowest-Cost Alternative According to Borda
N. Hulkower
Pub Date: 2010-11-01  DOI: 10.1080/1941658X.2010.10462232
Exact Probabilities by Simulation, a method introduced by Book (2010a) for determining which candidate in a trade study is the probable lowest-cost alternative, is extended to ensure that all available data generated are used. The Borda Count, the only "non-dictatorial" positional voting method that satisfies the criteria for a rational decision procedure while using the complete information, is then applied to determine the rank ordering of the alternatives. The extended method, called "Exact Probabilities by Simulation with Borda," gives the most reliable outcome; it yields a ranking that easily blends with those based on other criteria to select the best alternative in a trade study.
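A minimal sketch of the Borda aggregation step described above: each simulation draw ranks the alternatives by cost, each alternative earns Borda points from its rank in every draw, and the totals give the overall ordering. The cost distributions are illustrative assumptions, and this shows only the ranking mechanics, not Hulkower's full procedure.

```python
# Sketch of "Exact Probabilities by Simulation with Borda": rank alternatives by
# cost in every Monte Carlo draw, award Borda points, and sum them.
# Cost distributions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
names = ["A", "B", "C"]
draws = np.column_stack([
    rng.lognormal(np.log(90), 0.40, n),    # A: low median, high spread
    rng.lognormal(np.log(100), 0.15, n),   # B
    rng.lognormal(np.log(110), 0.10, n),   # C
])

# In each draw, the cheapest alternative gets the most Borda points
# (k-1 for first place, k-2 for second, ..., 0 for last).
k = draws.shape[1]
ranks = draws.argsort(axis=1).argsort(axis=1)   # 0 = cheapest in that draw
borda = (k - 1 - ranks).sum(axis=0)

for i in np.argsort(-borda):
    print(f"{names[i]}: Borda score = {borda[i]:,}")
```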
{"title":"The Probable Lowest-Cost Alternative According to Borda","authors":"N. Hulkower","doi":"10.1080/1941658X.2010.10462232","DOIUrl":"https://doi.org/10.1080/1941658X.2010.10462232","url":null,"abstract":"Abstract Exact Probabilities by Simulation, a method introduced by Book (2010a) for determining which candidate in a trade study is the probable lowest-cost alternative, is extended to ensure that all available data generated are used. The Borda Count, the only “non-dictatorial” positional voting method that satisfies the criteria for a rational decision procedure while using the complete information, is then applied to determine the rank ordering of the alternatives. The extended method, called “Exact Probabilities by Simulation with Borda,” gives the most reliable outcome; it yields a ranking that easily blends with those based on other criteria to select the best alternative in a trade study.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"108 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127943562","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}