Enhanced Scenario-Based Method for Cost Risk Analysis: Theory, Application, and Implementation
P. Garvey, Brian M. Flynn, P. Braxton, Richard Lee
Pub Date: 2012-07-01 | DOI: 10.1080/1941658X.2012.734757
In 2006, the scenario-based method was introduced as an alternative to advanced statistical methods for generating measures of cost risk. Since then, enhancements to the scenario-based method have been made. These include integrating historical cost performance data into the scenario-based method's algorithms and providing a context for applying the method from the perspective of the 2009 Weapon Systems Acquisition Reform Act. Together, these improvements define the enhanced scenario-based method, a historical-data-driven application of the scenario-based method. This article presents enhanced scenario-based method theory, application, and implementation. With today's emphasis on affordability-based decision-making, the enhanced scenario-based method promotes realism in estimating program costs by providing an analytically traceable and defensible basis behind data-derived measures of risk and cost estimate confidence. In memory of Dr. Steve Book, nulli secundus, for his kindness and devotion, and for his invaluable comments and insights on an earlier draft.
{"title":"Enhanced Scenario-Based Method for Cost Risk Analysis: Theory, Application, and Implementation","authors":"P. Garvey, Brian M. Flynn, P. Braxton, Richard Lee","doi":"10.1080/1941658X.2012.734757","DOIUrl":"https://doi.org/10.1080/1941658X.2012.734757","url":null,"abstract":"In 2006, the scenario-based method was introduced as an alternative to advanced statistical methods for generating measures of cost risk. Since then, enhancements to the scenario-based method have been made. These include integrating historical cost performance data into the scenario-based method's algorithms and providing a context for applying the scenario-based method from the perspective of the 2009 Weapon Systems Acquisition Reform Act. Together, these improvements define the enhanced the scenario-based method. The enhanced scenario-based method is a historical data-driven application of scenario-based method. This article presents enhanced scenario-based method theory, application, and implementation. With today's emphasis on affordability-based decision-making, the enhanced scenario-based method promotes realism in estimating program costs by providing an analytically traceable and defensible basis behind data-derived measures of risk and cost estimate confidence. In memory of Dr. Steve Book, nulli secundus, for his kindness and devotion, and for his invaluable comments and insights on an earlier draft.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117350431","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Prediction Bounds for General-Error-Regression Cost-Estimating Relationships
Stephen A. Book
Pub Date: 2012-01-01 | DOI: 10.1080/1941658X.2012.682935
Estimating the cost of a system under development is essentially trying to predict the future, which means that any such estimate contains uncertainty. When estimating using a cost-estimating relationship (CER), a portion of this uncertainty arises from the possibility that the form to which regression analysis is applied may be the incorrect one. That is, the data may have been fit to a linear form, but some curvilinear relationship may model the data more appropriately. Assuming the algebraic model being used is the correct one, the CER's uncertainty is described by its standard error of the estimate (SEE), which is essentially the standard deviation of the errors (residuals) made in applying that CER to estimate the (known) costs of the systems comprising the historical database. The SEE depends primarily on the extent to which those (known) costs fit the CER that purports to model them. Finally, additional uncertainty associated with a specific CER arises from the location of the particular cost-driver value (x) inside or outside the range of cost-driver values for programs comprising the historical cost database. For example, if x were located near the center of the range of its historical values, the CER would provide a more precise measure of the element's cost than if x were located toward the edges of, or even outside, the data range. The total uncertainty of CER-based estimates is a combination of all sources of uncertainty. The first kind of uncertainty mentioned, which questions the particular CER shape involved, cannot be measured without redoing the regression analysis for a wide variety of algebraic and other kinds of CER forms. Once we have decided upon a definite CER form, the SEE, represented by only one number characteristic of the CER, is fairly easy to measure for any CER shape or error model using known algebraic formulas. The second kind of uncertainty associated with a specific CER, which assesses both the CER itself and the value of the cost-driving parameter, is more complicated, and the way to account for it is completely understood only in the case of classical linear regression, i.e., ordinary least squares (OLS). As a result, explicit formulas exist for "prediction intervals" that bound cost estimates based on CERs that have been derived by applying OLS to historical cost data. For CERs, even linear ones, derived by other statistical methods, there appears to be no general method of solution described in the theoretical statistical literature. This report illustrates the application of bootstrap statistical sampling, a 34-year-old statistical process (Casella, 2003), to the problem of estimating prediction bounds for multiplicative-error and other CERs derived by non-OLS methods. After the bootstrap method is shown to be capable of yielding prediction bounds that approximate the known OLS bounds fairly well, it is applied to CERs for which no analytical prediction-interval formulas are available.
{"title":"Prediction Bounds for General-Error-Regression Cost-Estimating Relationships","authors":"Stephen A. Book","doi":"10.1080/1941658X.2012.682935","DOIUrl":"https://doi.org/10.1080/1941658X.2012.682935","url":null,"abstract":"Estimating the cost of a system under development is essentially trying to predict the future, which means that any such estimate contains uncertainty. When estimating using a costestimating relationship (CER), a portion of this uncertainty arises from the possibility that the cost-estimating form to which regression analysis is applied may be the incorrect one. That is, the data may have been fit to a linear form, but some curvilinear relationship may more appropriately model the data. Assuming the algebraic model being used is the correct one, the CER’s uncertainty is described by its standard error of the estimate (SEE), which is basically the standard deviation of errors made (residuals) in applying that CER to estimate the (known) costs of the systems comprising the historical database. The SEE depends primarily on the extent to which those (known) costs fit the CER that purports to model them. Finally, additional uncertainty associated with a specific CER arises from the location of the particular cost-driver value (x) within or without the range of cost-driver values for programs comprising the historical cost database. For example, if x were located near the center of the range of its historical values, the CER would provide a more precise measure of the element’s cost than if x were located toward the edges or even outside the data range. The total uncertainty of CER-based estimates is a combination of all sources of uncertainty. The first kind of uncertainty mentioned, which questions the particular CER shape involved, cannot be measured without redoing the regression analysis for a wide variety of algebraic and other kinds of CER forms. Once we have decided upon a definite CER form, the SEE, represented by only one number characteristic of the CER, is fairly easy to measure for any CER shape or error model using known algebraic formulas. The second kind of uncertainty associated with a specific CER, which assesses both the CER itself and the value of the cost-driving parameter, is more complicated, and the way to account for it is completely understood only in the case of classical linear regression, i.e., ordinary least squares (OLS). As a result, explicit formulas exist for “prediction intervals” that bound cost estimates based on CERs that have been derived by applying OLS to historical cost data. For CERs, even linear ones, derived by other statistical methods, there appears to be no general method of solution described in the theoretical statistical literature. This report illustrates the application of bootstrap statistical sampling, a 34-year-old statistical process (Casella, 2003), to the problem of estimating prediction bounds for multiplicative-error and other CERs derived by non-OLS methods. 
After the bootstrap method is shown to be capable of yielding prediction bounds that approximate the known OLS bounds fairly","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128243365","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
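The bootstrap procedure the report builds on is straightforward to sketch. The fragment below fits a power-form CER with multiplicative error by minimizing squared percentage error (one representative non-OLS criterion, not necessarily the paper's own error model), then case-resamples the database, refits, and attaches a resampled multiplicative error to each prediction so the resulting percentiles are prediction bounds rather than confidence bounds. Data and names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Hypothetical historical database: cost vs. a single driver for 12 systems.
x = np.array([110, 150, 200, 260, 330, 410, 500, 600, 710, 830, 960, 1100.0])
y = 2.5 * x ** 0.8 * rng.lognormal(0.0, 0.15, x.size)

def fit_mult_error(x, y):
    """Fit y = a * x**b by minimizing squared percentage error."""
    b0, log_a0 = np.polyfit(np.log(x), np.log(y), 1)   # log-log start point
    def sse_pct(p):
        a, b = p
        f = a * x ** b
        return np.sum(((y - f) / f) ** 2)
    return minimize(sse_pct, x0=[np.exp(log_a0), b0], method="Nelder-Mead").x

a_hat, b_hat = fit_mult_error(x, y)
resid_pct = (y - a_hat * x ** b_hat) / (a_hat * x ** b_hat)

x_new, preds = 700.0, []
for _ in range(1000):
    idx = rng.integers(0, x.size, x.size)           # case resampling
    a_b, b_b = fit_mult_error(x[idx], y[idx])       # refit on each resample
    eps = rng.choice(resid_pct)                     # fresh multiplicative error
    preds.append(a_b * x_new ** b_b * (1.0 + eps))  # a prediction, not just a mean
lo, hi = np.percentile(preds, [5, 95])
print(f"90% bootstrap prediction bounds at x = {x_new:.0f}: [{lo:.0f}, {hi:.0f}]")
```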
A Tribute to Dr. Stephen A. Book (1941–2012)
H. Apgar, Neil Albert
Pub Date: 2012-01-01 | DOI: 10.1080/1941658X.2012.682949
Dr. Stephen A. Book, co-editor of the Journal of Cost Analysis and Parametrics, died at his home in Seal Beach, CA, on Tuesday, January 10, 2012. Steve touched the lives of many of us on a personal and professional level through his friendship, teaching, mentoring, leadership, and inspiration. He also left an indelible mark on the cost, schedule, and risk analysis professional community worldwide. Steve earned his Ph.D. in mathematics, with a concentration in probability and statistics, at the University of Oregon. He joined The Aerospace Corporation in 1980, where he worked on a wide variety of Air Force programs and directed a vigorous program of research into methods of conducting cost and schedule risk analyses and deriving cost estimating relationships (CERs). He went on to serve as Director, Cost and Requirements Analysis, from 1989 to 1995 and then, from 1996 to 2000, held one of the most eminent titles The Aerospace Corporation would bestow: "Distinguished Engineer." Steve joined MCR in January 2001 and served as Chief Technical Officer from 2001 to 2009. Dr. Christian Smart (JCAP Managing Editor), who worked for Steve at MCR, remembers him as a pioneering researcher, a prolific writer and editor, and one of the icons of our profession: "Those of us who saw him lecture at a seminar or at a conference remember him as a gifted and witty speaker. Those of us who knew Steve personally remember him fondly for his self-deprecating wit, warmth, and patience as a teacher. Steve may be gone, but his legacy continues not only through his published papers, but also in the lives that he touched. I am fortunate to have known Steve and to have worked for him. Steve was my primary mentor in cost analysis, and I miss him." Dr. Book was the recipient of ISPA's Freiman Award for Lifetime Achievement and, in 2010, of the SCEA Lifetime Achievement Award; he is one of only four individuals to receive both. At the SCEA awards ceremony, Dick Coleman (SCEA Regional VP) delivered a warm and personal accolade when he stressed how "... the facts do not explain the deep respect and high regard that all in the cost community feel for Steve. Steve is a giant in the field, but unlike many eminent people, Steve is as unaffected and warm a person as you will ever meet. He hasn't a trace of pomp or pretense. He is known for his iconoclastic style and his innovation, but nobody ever minded Steve's view on things, even when it was their ox he was goring,
{"title":"A Tribute to Dr. Stephen A. Book (1941–2012)","authors":"H. Apgar, Neil Albert","doi":"10.1080/1941658X.2012.682949","DOIUrl":"https://doi.org/10.1080/1941658X.2012.682949","url":null,"abstract":"Dr. Stephen A. Book, co-editor of the Journal of Cost Analysis and Parametrics, died at his home in Seal Beach, CA, on Tuesday, January 10, 2012. Steve touched the lives of many of us on a personal and professional level through his friendship, teaching, mentoring, leadership, and inspiration. He also left an indelible mark on the cost, schedule, and risk analysis professional community worldwide. Steve earned his Ph.D. in mathematics, with a concentration in probability and statistics, at the University of Oregon. He joined The Aerospace Corporation in 1980, where he worked on a wide variety of Air Force programs and directed a vigorous program of research analysis into methods of conducting cost and schedule risk analyses and deriving cost estimating relationships (CERs). He went on to serve as Director, Cost and Requirements Analysis from 1989 to 1995 and then held one of the most eminent titles The Aerospace Corporation would bestow, the title of “Distinguished Engineer” from 1996 to 2000. Steve joined MCR in January 2001 and served as Chief Technical Officer from 2001 to 2009. Dr. Christian Smart (JCAP Managing Editor), who worked for Steve at MCR, remembers Steve as a pioneering researcher, a prolific writer and editor, and one of the icons of our profession. “Those of us who saw him lecture at a seminar or at a conference remember him as a gifted and witty speaker. Those of us who knew Steve personally remember him fondly for his self-deprecating wit, warmth, and patience as a teacher. Steve may be gone, but his legacy continues not only through his published papers, but also in the lives that he touched. I am fortunate to have known Steve and to have worked for him. Steve was my primary mentor in cost analysis, and I miss him.” Dr. Book was the recipient of ISPA’s Freiman Award for Lifetime Achievement. In 2010, he was the recipient of the SCEA Lifetime Achievement Award. He is one of only four individuals to receive both lifetime achievement awards. At the SCEA Awards ceremony, Dick Coleman (SCEA Regional VP) delivered a warm and personal accolade when he stressed how “ . . . the facts do not explain the deep respect and high regard that all in the cost community feel for Steve. Steve is a giant in the field, but unlike many eminent people, Steve is as unaffected and warm a person as you will ever meet. He hasn’t a trace of pomp or pretense. He is known for his iconoclastic style and his innovation, but nobody ever minded Steve’s view on things, even when it was their ox he was goring,","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"366 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133981858","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Acknowledgment of Reviewers' Services
Pub Date: 2012-01-01 | DOI: 10.1080/1941658x.2012.696963
Comparison of Cumulative Average to Unit Learning Curves: A Monte Carlo Approach
T. Miller, A. Dowling, David Youd, Eric J. Unger, E. White
Pub Date: 2012-01-01 | DOI: 10.1080/1941658X.2012.682943
Cumulative average and unit cost learning curve methodologies dominate current learning curve theory. Both models mathematically estimate the structure of costs over time and under particular conditions. While cost estimators and industries have shown preferences for particular models, this article evaluates model performance under varying program characteristics. A Monte Carlo approach is used to perform the analysis and identify the superior method under differing programmatic factors and conditions. Decision charts are provided to aid analysts' learning curve model selection for aircraft production and modification programs. Overall, the results indicate that unit theory outperforms cumulative average theory when more than 40 units exist to create a prediction learning curve, or when the data exhibit high learning and low variation; however, cumulative average theory predicts unit costs with less error when few units are available to create the curve, learning is low, and variation is high. This article is not subject to US copyright law.
{"title":"Comparison of Cumulative Average to Unit Learning Curves: A Monte Carlo Approach","authors":"T. Miller, A. Dowling, David Youd, Eric J. Unger, E. White","doi":"10.1080/1941658X.2012.682943","DOIUrl":"https://doi.org/10.1080/1941658X.2012.682943","url":null,"abstract":"Cumulative average and unit cost learning curve methodologies dominate current learning curve theory. Both models mathematically estimate the structure of costs over time and under particular conditions. While cost estimators and industries have shown preferences for particular models, this article evaluates model performance under varying program characteristics. A Monte Carlo approach is used to perform analysis and identify the superior method for use under differing programmatic factors and conditions. Decision charts are provided to aide analysts' learning curve model selection for aircraft production and modification programs. Overall, the results indicate that the unit theory outperforms the cumulative average theory when more than 40 units exist to create a prediction learning curve or when the data presents high learning and low variation in the program; however, the cumulative average theory predicts unit costs with less error when few units to create the curve exists, low learning occurs, and high variation transpires. This article not subject to US copyright law.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131617549","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Fractal Nature of Cost Risk: The Portfolio Effect, Power Laws, and Risk and Uncertainty Properties of Lognormal Distributions
C. Smart
Pub Date: 2012-01-01 | DOI: 10.1080/1941658X.2012.682922
Cost risk can be added to the list of the many phenomena in nature that follow a power-law probability distribution. Both the normal and the lognormal, neither of which is a power-law distribution, underestimate the probability of extreme cost growth, as shown by comparison with empirical data. This situation puts the widely debated "portfolio effect" into further dispute. However, even though power laws are useful for modeling extreme events, budgets are not typically set at extreme percentiles such as the 90th; indeed, they are usually set at the 70th percentile or below. In addition, it is shown that the lognormal distribution is also problematic in that region, and for percentile funding in general. When budgets and/or reserves for an individual program are set by percentile funding at or below the 70th percentile, the normal distribution therefore appears to be the best option.
{"title":"The Fractal Nature of Cost Risk: The Portfolio Effect, Power Laws, and Risk and Uncertainty Properties of Lognormal Distributions","authors":"C. Smart","doi":"10.1080/1941658X.2012.682922","DOIUrl":"https://doi.org/10.1080/1941658X.2012.682922","url":null,"abstract":"Cost risk can be added to the list of the many phenomena in nature that follow a power-law probability distribution. Both the normal and lognormal, neither of which is a power-law distribution, underestimate the probability of extreme cost growth, as shown by comparison with empirical data. This situation puts the widely debated “portfolio effect” into further dispute. However, even though power laws are useful for modeling extreme events, budgets are not typically set at extreme percentiles, such as the 90th. Indeed, budgets are usually set at the 70th percentile or below. In addition, it is shown that the lognormal distribution is also problematic in that region and for percentile funding in general. To model cost risk for an individual program by setting budgets and/or reserves using percentile funding with a percentile chosen at or below the 70th percentile, it appears that the normal distribution may be the best option.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133068429","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using Earned Value Data to Detect Potential Problems in Acquisition Contracts
C. Grant Keaton, Edward D. White, Eric J. Unger
Pub Date: 2011-07-01 | DOI: 10.1080/1941658X.2011.628594
Government contractors report earned value information to government agencies in monthly contract performance reports. Though major differences may exist in the data between successive contract performance reports, we know of no government effort to detect these occurrences. Identifying major changes may locate and isolate problems and thus prevent million- and billion-dollar cost and schedule overruns. In this study, we illustrate a proof-of-concept approach to identifying changes in the cost performance index and the schedule performance index that may indicate problems with contract performance. We find that an intuitive detection algorithm identifies changes in the cost performance index and the schedule performance index that correspond to large changes in the Estimate at Completion from 1 to 12 months out. The ability to detect unusual changes provides decision-makers with warnings of potential problems on acquisition contracts.
{"title":"Using Earned Value Data to Detect Potential Problems in Acquisition Contracts","authors":"C. Grant Keaton, Edward D. White, Eric J. Unger","doi":"10.1080/1941658X.2011.628594","DOIUrl":"https://doi.org/10.1080/1941658X.2011.628594","url":null,"abstract":"Government contractors report earned value information to government agencies in monthly contract performance reports. Though major differences may exist in the data between subsequent contract performance reports, we know of no government effort to detect these occurrences. The identification of major changes may locate and isolate problems and, thus, prevent million and billion dollar cost and schedule overruns. In this study, we illustrate a proof of concept approach to identify changes in the cost performance index and the schedule performance index that may indicate problems with contract performance. We find the intuitive detection algorithm identifies changes in the cost performance index and the schedule performance index that correspond to large changes in the Estimate at Complete from 1 to 12 months out. The ability to detect unusual changes provides decision-makers with warnings of potential problems for acquisition contracts.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"238 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116176649","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Editorial Board EOV
Pub Date: 2011-07-01 | DOI: 10.1080/1941658x.2011.645774
Estimating Cost and Schedule of Reliability Improvement
D. A. Lee, E. A. Long
Pub Date: 2011-07-01 | DOI: 10.1080/1941658X.2011.627754
We extend a well-established model of reliability growth in a reliability improvement program, the Army Materiel Systems Analysis Activity Maturity Projection Model (AMPM), to include a model of the program's cost. We show how the extended model may be used to plan cost-optimal or schedule-optimal integrated programs of reliability improvement and testing, from early design through developmental and operational testing, and illustrate the process with an example from an actual program.
{"title":"Estimating Cost and Schedule of Reliability Improvement","authors":"D. A. Lee, E. A. Long","doi":"10.1080/1941658X.2011.627754","DOIUrl":"https://doi.org/10.1080/1941658X.2011.627754","url":null,"abstract":"We extend a well-established model of reliability growth in a reliability improvement program, the Army Materiel Systems Analysis Activity Maturity Projection Model (AMPM), to include a model of the program's cost. We show how the extended model may be used to plan cost-optimal or schedule-optimal integrated programs of reliability improvement and testing, from early design through developmental and operational testing, and illustrate the process with an example from an actual program.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122668631","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Closed-Form Solution for the Production-Break Retrograde Method
B. Gillespie, Darrell Hamilton
Pub Date: 2011-07-01 | DOI: 10.1080/1941658X.2011.627757
This article explores and discusses concepts surrounding the multi-step retrograde analysis process for learning curve production breaks that was popularized by George Anderlohr in his 1969 Industrial Engineering article "What Production Breaks Cost." Mr. Anderlohr based much of his analysis on the cumulative average curve method, but the basic principles have been widely accepted and used to perform the equivalent calculation with unit-theory learning curves. Because Mr. Anderlohr's method is considered the standard for such calculations, and because the method is relatively simple to perform, little has been written either to simplify the process or to explain apparent anomalies in his methodology and in other designated official sources, such as guidance published by the Government Accountability Office (GAO). This article briefly explores and resolves the more vexing of these anomalies and then introduces a single closed-form equation that bypasses the multi-step method, saving the cost analyst time and minimizing opportunities for trivial mathematical errors.
{"title":"A Closed-Form Solution for the Production-Break Retrograde Method","authors":"B. Gillespie, Darrell Hamilton","doi":"10.1080/1941658X.2011.627757","DOIUrl":"https://doi.org/10.1080/1941658X.2011.627757","url":null,"abstract":"This article explores and discusses concepts surrounding the multi-step retrograde analysis process for learning curve production breaks that was popularized by George Anderlohr, in his 1969 Industrial Engineering article “What Production Breaks Cost.” Mr. Anderlohr based much of his analysis using the cumulative average curve method, but the basic principles have been widely accepted and used to calculate the equivalent calculation using the unit theory learning curves. Because Mr. Anderlohr's method is considered the standard for such calculations and because the method is relatively simple to perform, not much has been written to either simplify the process or to explain what appear to be anomalies in his methodology and other designated official sources such as that published by the Government Accountability Office (GAO). The article will briefly explore and answer the more vexing of the anomaly issues and then introduce a single closed-form equation to bypass the multi-step method which can save the cost analyst time and minimizes opportunities for trivial mathematical errors.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130381904","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}