Activity-Based Parsimonious Cost Systems
Pub Date: 2014-05-04 | DOI: 10.1080/1941658X.2014.929987
Shannon L. Charles, Don R. Hansen
Accurate product costing information is instrumental in effective decision making, especially for product-related decisions such as product mix and product emphasis. A body of literature suggests that activity-based costing plays an important role in providing accurate product costing information. However, activity-based costing systems are also more complex than traditional single-cost-driver systems because of the number of cost drivers they identify. We propose a modification to the activity-based costing system that reduces this complexity. Using the concepts of linearly independent and dependent vectors, we show that it is possible to identify parsimonious systems that simplify an activity-based costing system without loss of accuracy. The modification presented in this study should entice more firms to adopt activity-based costing and reap the benefits of increased product costing accuracy.
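The core idea, that cost drivers whose consumption-ratio vectors are linear combinations of others can be dropped without changing reported product costs, can be illustrated with a short sketch. The Python example below uses hypothetical products, drivers, and cost pools; it is a minimal illustration of the linear-dependence test, not the procedure developed in the article.

```python
# Minimal sketch (hypothetical data): columns of D are consumption-ratio
# vectors, one per cost driver, rows are products.
import numpy as np

D = np.array([
    [0.40, 0.10, 0.25, 0.40, 0.30],
    [0.30, 0.20, 0.25, 0.30, 0.25],
    [0.20, 0.30, 0.25, 0.20, 0.25],
    [0.10, 0.40, 0.25, 0.10, 0.20],
])
activity_costs = np.array([200.0, 100.0, 80.0, 150.0, 120.0])  # cost pool per driver

full_cost = D @ activity_costs           # "full" ABC product costs

# Greedily keep drivers whose ratio vectors add rank; the rest are
# linearly dependent on the retained set.
kept = []
for j in range(D.shape[1]):
    trial = kept + [j]
    if np.linalg.matrix_rank(D[:, trial]) == len(trial):
        kept.append(j)

# Re-express the dropped drivers' costs on the retained drivers; with exact
# linear dependence the residual is ~0 and product costs are unchanged.
coeffs, *_ = np.linalg.lstsq(D[:, kept], full_cost, rcond=None)
reduced_cost = D[:, kept] @ coeffs

print("drivers kept:", kept)
print("reduced driver rates:", coeffs.round(2))
print("max product-costing error:", np.max(np.abs(full_cost - reduced_cost)))
```

With the hypothetical data above, two of the five drivers are linearly dependent on the others, so a three-driver system reproduces the full activity-based product costs exactly.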
{"title":"Activity-Based Parsimonious Cost Systems","authors":"Shannon L. Charles, Don R. Hansen","doi":"10.1080/1941658X.2014.929987","DOIUrl":"https://doi.org/10.1080/1941658X.2014.929987","url":null,"abstract":"Accurate product costing information is instrumental in effective decision making, especially for product related decisions, such as product mix and product emphasis. A body of literature suggests that activity-based costing plays an important role in providing accurate product costing information. However, activity-based costing systems are also more complex due to the number of cost drivers identified, compared to traditional single cost driver systems. We propose a modification to the activity-based costing system that reduces the complexity. Using the concepts of linearly independent and dependent vectors, it is shown that it is possible to identify parsimonious systems that simplify an activity-based costing system without loss of accuracy. The modification presented in this study will entice more firms to adopt an activity-based costing system and reap the benefits of increased product costing accuracy.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124763912","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cost Risk Allocation Theory and Practice
Pub Date: 2014-05-04 | DOI: 10.1080/1941658X.2014.922907
C. Smart
Risk allocation is the assignment of risk reserves from a total project or portfolio level to individual constituent elements. For example, cost risk at the total project level is allocated to individual work breakdown structure elements. This is a non-trivial exercise in most instances because of issues related to the aggregation of risks, such as the fact that percentiles do not add. For example, if a project is funded at a 70% confidence level, one cannot simply allocate that funding to work breakdown structure elements by assigning each its 70% confidence level estimate, because the resulting sum may (but will not necessarily) be larger than the 70% confidence estimate for the entire project. One method for allocating risk that has commonly been used in practice, and has been implemented in a cost estimating integration software package, is to assign each element risk in proportion to its share of the sum of the standard deviations of all work breakdown structure elements (Sandberg, 2007). Another popular method notes that risk is typically not symmetric and looks at the relative contribution of the element's variation above the mean or another reference estimate. Dr. Steve Book first presented this concept to a limited Government audience in 1992 and presented it to a wider audience several years later (Book, 1992, 2006). This technique, based on the concept of "need," has been implemented in the NASA/Air Force Cost Model (Smart, 2005). These contributions represent the current state of the practice in cost analysis. The notion of positive semi-variance as an alternative to the needs method was brought forth by Book (2006) and further propounded by Sandberg (2007). A new method proposed by Hermann (personal communication, 2010) discusses the concept of optimality in risk allocation and proposes a one-sided moment objective function for calculating the optimal allocation. An older method, developed in the 1990s by Lockheed Martin, assigns equal percentile allocations to all work breakdown structure elements (Goldberg and Weber, 1998). This method claims to be optimal, and Goldberg and Weber (1998) show that, under a very specific assumption, this is true. Aside from Hermann's paper and the report by Goldberg and Weber on the Lockheed Martin method, cost risk allocation has typically not been associated with optimality. Neither the proportional standard deviation method nor the needs method guarantees that the allocation scheme will be optimal or even necessarily desirable. Indeed, the twin topics of risk measurement and risk allocation have either been treated independently (Book, 2006) or treated as one and the same (Sandberg, 2007). Regardless, the current situation is muddled, with no clear delineation between the two.

In this article, the present author introduces to cost analysis the concept of gradient risk allocation, which has recently been used in the areas of finance and insurance.
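As a point of reference for the methods surveyed above, the proportional-standard-deviation allocation attributed to Sandberg (2007) can be sketched in a few lines. The work breakdown structure data below are hypothetical, and the needs-based and gradient methods are not shown.

```python
# Minimal sketch (hypothetical WBS data): allocate the total-project risk
# reserve to WBS elements in proportion to each element's standard deviation.
import numpy as np

point_estimates = np.array([10.0, 25.0, 15.0])   # element point estimates ($M)
std_devs        = np.array([ 2.0,  6.0,  3.0])   # element standard deviations ($M)

total_point   = point_estimates.sum()
total_funding = 55.0                              # e.g., the project's 70th-percentile estimate ($M)
reserve       = total_funding - total_point       # risk dollars to allocate

shares    = std_devs / std_devs.sum()             # proportional-sigma weights
allocated = point_estimates + reserve * shares    # funded amount per element

print("reserve:", reserve)
print("allocated:", allocated, "sum:", allocated.sum())  # sum equals total_funding
```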
{"title":"Cost Risk Allocation Theory and Practice","authors":"C. Smart","doi":"10.1080/1941658X.2014.922907","DOIUrl":"https://doi.org/10.1080/1941658X.2014.922907","url":null,"abstract":"Risk allocation is the assignment of risk reserves from a total project or portfolio level to individual constituent elements. For example, cost risk at the total project level is allocated to individual work breakdown structure elements. This is a non-trivial exercise in most instances, because of issues related to the aggregation of risks, such as the fact that percentiles do not add. For example, if a project is funded at a 70% confidence level then one cannot simply allocate that funding to work breakdown structure elements by assigning each its 70% confidence level estimate. This is because the resulting sum may (but not necessarily will) be larger than the total 70% confidence estimate for the entire project. One method for allocating risk that has commonly been used in practice and has been implemented in a cost estimating integration software package is to assign risk by assigning the element’s standard deviation as a proportion of the sum of the standard deviations for all work breakdown structure elements (Sandberg, 2007). Another popular method notes that risk is typically not symmetric, and looks at the relative contribution of the element’s variation above the mean or other reference estimate. Dr. Steve Book first presented this concept to a limited Government audience in 1992 and presented it to a wider audience several years later (Book, 1992, 2006). This technique, based on the concept of “need,” has been implemented in the NASA/Air Force Cost Model (Smart, 2005). These contributions represent the current state-of-the-practice in cost analysis. The notion of positive semi-variance as an alternative to the needs method was brought forth by Book (2006) and further propounded by Sandberg (2007). A new method proposed by Hermann (personal communication, 2010) discusses the concept of optimality in risk allocation and proposes a one-sided moment objective function for calculating the optimal allocation. An older method, developed in the 1990s by Lockheed Martin, assigns equal percentile allocations for all work breakdown structure elements (Goldberg and Weber, 1998). This method claims to be optimal, and Goldberg and Weber (1998) show that under a very specific assumption, that this is true. Aside from Hermann’s paper and the report by Goldberg and Weber on the Lockheed Martin method, cost risk allocation has typically not been associated with optimality. Neither the proportional standard deviation method nor the needs method guarantees the allocation scheme will be optimal or even necessarily desirable. Indeed, the twin topics of risk measurement and risk allocation have either been treated independently (Book, 2006), or they have been treated as one and the same (Sandberg, 2007). Regardless, the current situation is muddled, with no clear delineation between the two. 
In this article, the present author introduces to cost analysis the concept of gradient risk allocation, which has been recently used in the areas of finance and insuran","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132161472","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Engineering Systems: Best-in-Class/Worst-in-Class
Pub Date: 2014-05-04 | DOI: 10.1080/1941658X.2014.922905
Donald M. Beckett
Measurement's goal is to help assess performance: to determine which methods are productive or counterproductive. Metrics are tools used to identify and implement practices that lower costs, reduce time to market, and improve product quality. But process improvement is not accomplished through measurement or metrics alone. Rather, one must use the data to make conscious decisions that change the way business is done. In fact, one of the best ways to make those decisions is by studying the characteristics of best- and worst-in-class software projects. Referencing Quantitative Software Management's database of more than 10,000 completed software projects, this article evaluates the common factors that define the most and least successful engineering projects, drawn from the database's System Software, Scientific, Telecom, and Command and Control application domains. Presenting a thorough analysis of project staffing, effort, duration, cost, and quality data, this article gives project managers a solid, scientific framework for evaluating potential projects and identifying winning strategies.
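As a rough illustration of the best-/worst-in-class comparison described above, the sketch below splits a set of completed projects into top and bottom quartiles by a simple delivery-rate metric and compares one characteristic across the two groups. The field names and figures are hypothetical and are not drawn from the Quantitative Software Management database or its productivity measures.

```python
# Minimal sketch (hypothetical data): rank completed projects by a simple
# delivery-rate metric and compare best- vs. worst-in-class quartiles.
import pandas as pd

projects = pd.DataFrame({
    "size_units":  [120, 300, 80, 450, 200, 150, 500, 90],
    "effort_pm":   [ 30, 110, 15, 220,  60,  35, 300, 20],   # person-months
    "duration_mo": [ 10,  18,  7,  26,  14,  11,  30,  8],
    "peak_staff":  [  4,   9,  3,  14,   6,   4,  18,   3],
})

projects["rate"] = projects["size_units"] / projects["effort_pm"]
best  = projects[projects["rate"] >= projects["rate"].quantile(0.75)]
worst = projects[projects["rate"] <= projects["rate"].quantile(0.25)]

print("best-in-class mean peak staff: ", best["peak_staff"].mean())
print("worst-in-class mean peak staff:", worst["peak_staff"].mean())
```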
{"title":"Engineering Systems: Best-in-Class/Worst-in-Class","authors":"Donald M. Beckett","doi":"10.1080/1941658X.2014.922905","DOIUrl":"https://doi.org/10.1080/1941658X.2014.922905","url":null,"abstract":"Measurement’s goal is to help assess performance—to determine which methods are productive or counterproductive. Metrics are tools used to identify and implement practices that lower costs, reduce time to market, and improve product quality. But process improvement is not accomplished through measurement or metrics alone. Rather, one must use the data to make conscious decisions that change the way business is done. In fact, one of best ways to make those decisions is by studying the characteristics of best- and worst-in-class software projects. Referencing Quantitative Software Management’s database of 10,000+ completed software projects, this article evaluates the common factors that define the most and least successful engineering projects—drawn from the database’s System Software, Scientific, Telecom, and Command and Control application domains. Presenting a thorough analysis of project staffing, effort, duration, cost, and quality data, this article gives project managers a solid, scientific framework for evaluating potential projects and identifying winning strategies.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130115416","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Software Industry Goals for the Years 2014 through 2018
Pub Date: 2014-01-02 | DOI: 10.1080/1941658X.2014.890493
Capers Jones
This article discusses 20 quantitative targets for software engineering projects that are technically feasible to achieve within a five-year window. Some leading companies have already achieved many of these targets, but average and lagging companies have achieved only a few, if any. Software needs firm, achievable goals expressed in quantitative fashion. For example, the first goal is to remove 99.5% of bugs or defects before delivery, rather than today's average of below 90%. Inspections and static analysis prior to testing can achieve this goal for essentially every project.
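The 99.5% figure is a defect removal efficiency target. A minimal sketch follows, using the common definition of defect removal efficiency as the share of total defects found and removed before delivery; the defect counts are hypothetical.

```python
# Minimal sketch: defect removal efficiency (DRE) as the share of total
# defects found and removed before delivery. Counts are hypothetical.
def defect_removal_efficiency(found_before_delivery: int, found_after_delivery: int) -> float:
    total = found_before_delivery + found_after_delivery
    return found_before_delivery / total if total else 1.0

# A typical project today (per the abstract, below 90%):
print(defect_removal_efficiency(900, 120))   # ~0.88
# The 99.5% target: only about 5 of every 1,000 defects escape to the field.
print(defect_removal_efficiency(995, 5))     # 0.995
```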
{"title":"Software Industry Goals for the Years 2014 through 2018","authors":"Capers Jones","doi":"10.1080/1941658X.2014.890493","DOIUrl":"https://doi.org/10.1080/1941658X.2014.890493","url":null,"abstract":"This article discusses 20 quantitative targets for software engineering projects that are technically feasible to be achieved within a five year window. Some leading companies have already achieved many of these targets, but average and lagging companies have achieved only a few, if any. Software needs firm achievable goals expressed in quantitative fashion. For example the first goal is to remove 99.5% of bugs or defects before delivery rather than today’s average of below 90%. Inspections and static analysis prior to testing can achieve this goal for essentially every project.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"83 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126274193","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Applications of a Parsimonious Model of Development Programs’ Costs and Schedules
Pub Date: 2014-01-02 | DOI: 10.1080/1941658X.2014.890087
David W. Lee, Carol Dezwarte, Stephanie M. Sigalas-Markham, Jeremy M. Eckhause
A model of the cost and schedule of a development program, characterized by three non-dimensional parameters, provides a means for estimating the cost and schedule impacts of constraining funding below planned levels, as well as for assessing the realism of the costs and schedules of planned programs. In contrast to models of the Norden-Rayleigh-Weibull class, the model explicitly considers specific components of cost and captures the distinction between a development program's value (i.e., the things delivered) and its cost (i.e., the money paid to acquire that value). By treating staff levels, staff productivity and cost, overhead, purchased material costs, and the burdens imposed by staff coordination and by allocating a program's effort to individual workers or teams and collating the results, the model reflects the effects of management actions to make programs optimal by such criteria as minimal cost, minimal time, or minimal cost subject to a maximum-time constraint.
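For contrast with the Norden-Rayleigh-Weibull class mentioned above, a minimal sketch of the conventional Norden-Rayleigh cumulative cost profile follows. The parameters are hypothetical, and this is the baseline curve the abstract contrasts against, not the article's three-parameter model.

```python
# Minimal sketch: the Norden-Rayleigh cumulative cost profile
# E(t) = E_total * (1 - exp(-a * t**2)), with a = 1 / (2 * t_peak**2) so that
# spending (staffing) peaks at t_peak. Parameters are hypothetical.
import numpy as np

def rayleigh_cumulative_cost(t, total_cost, t_peak):
    a = 1.0 / (2.0 * t_peak ** 2)
    return total_cost * (1.0 - np.exp(-a * t ** 2))

t = np.linspace(0.0, 60.0, 7)                         # months
print(rayleigh_cumulative_cost(t, 100.0, 24.0).round(1))   # cumulative cost ($M)
```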
{"title":"Applications of a Parsimonious Model of Development Programs’ Costs and Schedules","authors":"David W. Lee, Carol Dezwarte, Stephanie M. Sigalas-Markham, Jeremy M. Eckhause","doi":"10.1080/1941658X.2014.890087","DOIUrl":"https://doi.org/10.1080/1941658X.2014.890087","url":null,"abstract":"A model of the cost and schedule of a development program, characterized by three non-dimensional parameters, gives means for estimating the cost and schedule impacts of constraining funding below planned levels, as well as for assessing the realism of the costs and schedules of planned programs. In contrast to models of the Norden-Rayleigh-Weibull class, the model explicitly considers specific components of cost, and captures the distinction between a development program’s value (i.e., the things delivered) and its cost (i.e., the money paid to acquire the value). Treating staff levels, staff productivity and cost, overhead, purchased material costs, and the burdens imposed by staff coordination and by allocating a program’s effort to individual workers or teams and collating the results, the model reflects effects of management actions to make programs optimal by such criteria as minimal cost, minimal time, or minimal cost subject to a maximum-time constraint.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130362027","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Trade Space, Product Optimization, and Parametric Analysis
Pub Date: 2014-01-02 | DOI: 10.1080/1941658X.2014.890086
D. Howarth
This article shows how to bound, build, and assemble trade spaces for product optimization. The advent of computerized tools that describe available trade spaces has changed not only the nature of optimized product design but also that of parametric cost studies. Because these tools allow broader analysis, engineers can produce many more potential designs. Those tools, explained in this article, allow parametricians to analyze trade spaces and determine the sets of product attributes that have the best chance for market success. Rather than trailing such engineering studies, however, parametricians may be able to lead them. In the process, they may be able to move their organizations toward more economically viable configurations, those that markets will more readily accept.
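One common way to narrow a trade space of candidate designs, screening for Pareto-efficient cost/performance combinations, can be sketched briefly. The design points below are hypothetical, and this is a generic illustration rather than the specific computerized tools the article describes.

```python
# Minimal sketch (hypothetical data): keep only Pareto-efficient design points,
# i.e., those for which no other design is both cheaper and higher-performing.
designs = [
    # (name, estimated cost, performance score)
    ("A", 10.0, 0.60),
    ("B", 12.0, 0.72),
    ("C", 14.0, 0.70),
    ("D", 18.0, 0.85),
    ("E", 16.0, 0.65),
]

def pareto_front(points):
    front = []
    for name, cost, perf in points:
        dominated = any(c <= cost and p >= perf and (c < cost or p > perf)
                        for _, c, p in points)
        if not dominated:
            front.append((name, cost, perf))
    return front

print(pareto_front(designs))   # A, B, and D survive; C and E are dominated by B
```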
{"title":"Trade Space, Product Optimization, and Parametric Analysis","authors":"D. Howarth","doi":"10.1080/1941658X.2014.890086","DOIUrl":"https://doi.org/10.1080/1941658X.2014.890086","url":null,"abstract":"This article shows how to bound, build, and assemble trade spaces for product optimization. The advent of computerized tools that describe available trade spaces has changed not only the nature of optimized product design, but that of parametric cost studies as well. Because these tools allow broader analysis, engineers can produce many more potential designs. Those tools, explained in this article, allow parametricians to analyze trade spaces in a manner that allows them to determine the sets of product attributes that have the best chance for market success. However, rather than trailing such engineering studies, parametricians may be able to lead them. In the process, parametricians may be able to move their organizations toward more economically viable configurations, those that markets will more readily accept.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"83 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124766552","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
COTECHMO: The Constructive Technology Development Cost Model
Pub Date: 2014-01-02 | DOI: 10.1080/1941658X.2014.891085
Mark Jones, P. Webb, M. Summers, P. Baguley
A detailed analysis of the available literature and the aerospace manufacturing industry identified a lack of cost estimation techniques for forecasting advanced manufacturing technology development effort and hardware cost. In response, this article presents two parametric Constructive Technology Development Cost Models (COTECHMO). The first, the COTECHMO Resources model, forecasts aerospace advanced manufacturing technology development effort in person-hours. When statistically analyzed, this model had an R-squared value of 98% and a high F-value of 106.65, indicating model significance. Its overall accuracy was tested, with 53% of the forecasts falling within 20% of the actuals. The second, the COTECHMO Direct Cost model, forecasts the development cost of aerospace advanced manufacturing technology process hardware. This model had a lower R-squared value of 76% and an F-value of 5.59, though each was still sufficient to establish model significance. However, the Direct Cost model's accuracy exceeded that of the Resources model, with 93% of the forecasts falling within 20% of the actuals. The article concludes with recommendations for future research, including suggestions for further verification and validation of each model, both within and outside of the supporting organization.
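The accuracy measure quoted for both models, the share of forecasts falling within 20% of the actuals, can be computed with a short sketch. The forecast and actual values below are hypothetical.

```python
# Minimal sketch (hypothetical data): share of forecasts landing within
# +/-20% of the actual value, the accuracy measure used for both models.
import numpy as np

def pct_within(forecast, actual, band=0.20):
    forecast, actual = np.asarray(forecast, float), np.asarray(actual, float)
    rel_err = np.abs(forecast - actual) / actual
    return float(np.mean(rel_err <= band))

forecast = [105, 250, 90, 400, 130]
actual   = [100, 300, 88, 520, 125]
print(pct_within(forecast, actual))   # 0.8 -> 80% of forecasts within 20%
```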
{"title":"COTECHMO: The Constructive Technology Development Cost Model","authors":"Mark Jones, P. Webb, M. Summers, P. Baguley","doi":"10.1080/1941658X.2014.891085","DOIUrl":"https://doi.org/10.1080/1941658X.2014.891085","url":null,"abstract":"A detailed analysis of the available literature and the aerospace manufacturing industry has identified a lack of cost estimation techniques to forecast advanced manufacturing technology development effort and hardware cost. To respond, this article presents two parametric ‘Constructive Technology Development Cost Models’ (COTECHMO). The COTECHMO Resources model is the first and is capable of forecasting aerospace advanced manufacturing technology development effort in person-hours. When statistically analyzed, this model had an outstanding R-squared value of 98% and a high F-value of 106.65, validating model significance. The general model accuracy was tested with 53% of the forecast data falling within 20% of the actual. The second, the COTECHMO Direct Cost model is capable of forecasting the development cost of the aerospace advanced manufacturing technology process hardware. This model had an inferior R-squared value of 76% and an F-value of 5.59, although each was still valid to determine model significance. However, the Direct Cost model accuracy exceeded the Resources model, with 93% of the forecast data falling within 20% of the actual. The article concludes with recommendations for future research, including suggestions for further enhancement of each model verification and validation, within and outside of the supporting organization.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"103 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114471012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Editorial Board EOV
Pub Date: 2013-07-01 | DOI: 10.1080/1941658x.2013.865423
Galaxy Charts: The 1,000-Light-Year View of the Data
Pub Date: 2013-07-01 | DOI: 10.1080/1941658X.2013.843420
R. Nehring, Katharine Mann, Robert Jones
This article presents a new kind of chart, called a Galaxy chart, which combines the strengths of other chart types. A Galaxy chart displays an entire Cost Element Structure on a single sheet of paper, showing all of the elements, their relationships, and their costs in a visually appealing way. Each child cost element is "in orbit" around its parent, with its own children "in orbit" around it. The size of each cost element is directly proportional to its magnitude. Galaxy charts provide many insights. For instance, a single Galaxy chart displays the cost element structure hierarchy, the most significant cost elements, the least significant cost elements, the descending order of cost elements, and cost elements of equal value. This article gives an overview of the Galaxy chart concept, explains how to construct one, and explains a few of the insights available from the display, insights that are typically difficult to gain without a Galaxy chart.
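The layout rule described above, each child element "in orbit" around its parent with size proportional to cost, can be sketched with a small recursive routine. The cost element structure below is hypothetical, and the article's actual construction rules may differ.

```python
# Minimal sketch (hypothetical cost element structure): place each child cost
# element "in orbit" around its parent and size it in proportion to its cost.
import math
import matplotlib.pyplot as plt

# element: (cost, parent); parent None marks the root of the structure
elements = {
    "Total":    (100.0, None),
    "Hardware": ( 55.0, "Total"),
    "Software": ( 30.0, "Total"),
    "Support":  ( 15.0, "Total"),
    "CPU":      ( 35.0, "Hardware"),
    "Chassis":  ( 20.0, "Hardware"),
}

def layout(elements, root, center=(0.0, 0.0), orbit=1.0):
    """Recursively position children on a circle around their parent."""
    positions = {root: center}
    children = [k for k, (_, p) in elements.items() if p == root]
    for i, child in enumerate(children):
        angle = 2.0 * math.pi * i / max(len(children), 1)
        pos = (center[0] + orbit * math.cos(angle),
               center[1] + orbit * math.sin(angle))
        positions.update(layout(elements, child, pos, orbit * 0.4))
    return positions

pos = layout(elements, "Total")
for name, (x, y) in pos.items():
    cost = elements[name][0]
    plt.scatter(x, y, s=40.0 * cost)          # marker area proportional to cost
    plt.annotate(name, (x, y), ha="center")
plt.axis("equal"); plt.axis("off")
plt.show()
```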
{"title":"Galaxy Charts: The 1,000-Light-Year View of the Data","authors":"R. Nehring, Katharine Mann, Robert Jones","doi":"10.1080/1941658X.2013.843420","DOIUrl":"https://doi.org/10.1080/1941658X.2013.843420","url":null,"abstract":"This article presents a new kind of chart, called a Galaxy chart, which combines the strengths of other chart types. A Galaxy chart displays an entire Cost Element Structure on a single sheet of paper, showing all of the elements, their relationships, and their costs in a visually appealing way. Each child cost element is “in orbit” around its parent, with its children “in orbit” around their parent. The size of each cost element is directly proportional to its magnitude. Galaxy charts provide many insights. For instance, a single Galaxy chart displays the cost element structure hierarchy, the most significant cost elements, the least significant cost elements, the descending order of cost elements, and cost elements of equal value. This article will give an overview of the Galaxy chart concept, explain how to construct one, and explain a few of the insights that are available from the display insights that are typically difficult to gain without using a Galaxy chart.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130005963","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Feasibility of Budget for Acquisition of Two Joint Support Ships
Pub Date: 2013-07-01 | DOI: 10.1080/1941658X.2013.843423
E. Barkel, T. Yalkin
The mandate of the Parliamentary Budget Officer (PBO) is to provide independent analysis to Parliament on the state of the nation's finances, the government's estimates, and trends in the Canadian economy, and, upon request from a committee or parliamentarian, to estimate the financial cost of any proposal for matters over which Parliament has jurisdiction. The PBO received requests from the Member for St. John's East and the Member for Scarborough-Guildwood to undertake an independent cost assessment of the Joint Support Ship project. This report assesses the feasibility of replacing Canada's current Auxiliary Oiler Replenishment ships with two Joint Support Ships within the allocated funding envelope. The cost estimates and observations presented in this report represent a preliminary set of data for discussion and may change subject to the provision of detailed financial and non-financial data to the Parliamentary Budget Officer by the Department of National Defence, Public Works and Government Services Canada, and the shipyards. The cost estimates reflect a point-in-time set of observations based on limited and high-level data obtained from a variety of sources. These high-level cost estimates and observations are to be viewed neither as conclusions regarding the policy merits of the legislation nor as a view of future costs.
{"title":"Feasibility of Budget for Acquisition of Two Joint Support Ships","authors":"E. Barkel, T. Yalkin","doi":"10.1080/1941658X.2013.843423","DOIUrl":"https://doi.org/10.1080/1941658X.2013.843423","url":null,"abstract":"The mandate of the Parliamentary Budget Officer is to provide independent analysis to Parliament on the state of the nation's finances, the government's estimates, and trends in the Canadian economy, and, upon request from a committee or parliamentarian, to estimate the financial cost of any proposal for matters over which Parliament has jurisdiction. The PBO received requests from the Member from St John's East and the Member from Scarborough-Guildwood to undertake an independent cost assessment of the Joint Support Ship project. This report assesses the feasibility of replacing Canada's current Auxiliary Oiler Replenishment ships with two Joint Support Ships within the allocated funding envelope. The cost estimates and observations presented in this report represent a preliminary set of data for discussion and may change subject to the provision of detailed financial and non-financial data to the Parliamentary Budget Officer by the Department of National Defence, Public Works, and Government Services Canada, and the shipyards. The cost estimates included reflect a point-in-time set of observations based on limited and high-level data obtained from a variety of sources. These high-level cost estimates and observations are neither to be viewed as conclusions in relation to the policy merits of the legislation nor as a view to future costs.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"140 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126061676","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}