Balancing Expert Opinion and Historical Data: The Case of Baseball Umpires
Pub Date: 2016-09-01 | DOI: 10.1080/1941658X.2016.1267456
R. Valerdi
Many decisions benefit from situations where both ample expert opinion and historical data exist. In cost modeling, these may include the costs of software development, learning curve rates for specific manufacturing tasks, and the unit rate costs of operating certain products. When making forecasts, we are often faced with the decision to base our estimates on either expert opinion or historical data. When these two perspectives converge, we have high confidence in the estimate. The more interesting case is when they contradict each other; this is where the estimator needs to dig deeper to determine the sources of the inconsistency. Cost modelers are not the only ones who struggle with deciding whether to trust experts or data. Data scientists increasingly face this duality, especially in professional sports, where expert opinion is associated with the traditional viewpoint and data-driven decision making with a more modern approach. In the United States, professional sports teams are increasingly using analytics to optimize their athletes’ performance as well as their business operations (Pelton, 2015). But the culture of professional sports still depends heavily on experience and gut feel. The case of baseball umpires provides a good example of expert opinion being preferred over historical data. In professional baseball, the umpire’s job is to determine whether the ball passed through the strike zone. If the batter does not swing, it is left to the umpire’s expert judgement to identify whether the pitch was a ball or a strike. The strike zone is defined in the official rules of baseball and is not subject to interpretation; however, the measurement of that strike zone is left entirely to human judgement. Even more challenging, the decision must be made in a matter of seconds under extreme pressure. Chen, Moskowitz, and Shue (2016) analyzed baseball umpire data using the PITCHf/x system, which tracks the actual location of each pitch using multiple cameras. By comparing the umpire’s decision to the actual placement of the ball relative to the strike zone, they determined that, during the 2008 to 2016 seasons, which included 127 different umpires calling over 3.5 million pitches, umpires were correct only part of the time, as shown in Table 1. If baseball umpires are getting one out of every eight ball/strike calls wrong, that adds up to more than 30,000 mistakes a year. In most industries, and even in other professional sports leagues, this would be unacceptable, but baseball traditionalists are hesitant to adopt new technologies that remove the human element from the game.
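As a quick plausibility check on the figures quoted above, the arithmetic can be sketched in a few lines of Python. The nine-season span and the uniform one-in-eight error rate are assumptions taken from the editorial's round numbers, not from the underlying study.

```python
# Back-of-the-envelope check of the annual-mistakes claim. All inputs
# are the editorial's round figures; per-season counts are assumed even.
total_pitches = 3_500_000   # called pitches analyzed, per the editorial
seasons = 9                 # the 2008 through 2016 seasons
error_rate = 1 / 8          # roughly one of every eight calls missed

mistakes_per_season = total_pitches / seasons * error_rate
print(f"~{mistakes_per_season:,.0f} missed calls per season")
# ~48,611 -- comfortably above the "more than 30,000 mistakes a year" figure
```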
{"title":"Balancing Expert Opinion and Historical Data: The Case of Baseball Umpires","authors":"R. Valerdi","doi":"10.1080/1941658X.2016.1267456","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1267456","url":null,"abstract":"Many decisions benefit from situations where there exist both ample expert opinion and historical data. In cost modeling these may include the costs of software development, the learning curve rates for specific manufacturing tasks, and the unit rate costs of operating certain products. When making forecasts we are often faced with the decision to base our estimates on either expert opinion or historical data. When these two perspectives converge, we have high confidence in the estimate. The more interesting case is when they contradict. This is where the estimator needs to dig deeper in order to determine the sources of inconsistencies. Cost modelers are not the only ones who struggle with deciding whether to trust experts or data. Data scientists are increasingly dealing with this duality especially in the context of professional sports where expert opinion is associated with the traditional viewpoint and data-driven decision making is associated with a more modern approach. In the United States, professional sports teams are increasingly using analytics to optimize their athletes’ performance as well as their business operations (Pelton, 2015). But the culture of professional sports still depends heavily on experience and gut feel. The case of baseball umpires provides a good example of expert opinion being preferred over historical data. In professional baseball, the umpire’s job is to determine whether the ball passed the strike zone1 or not. If the batter does not swing it is left to the umpire’s expert judgement to identify whether the pitch was a ball or a strike. The strike zone is defined in the official rules of baseball and are not subject to interpretation, however, the implementation of measuring said strike zone is entirely left to human judgement. Even more challenging is that the decision must be made in a matter of seconds under extreme pressure. Chen, Moskowitz, and Shue (2016) analyzed baseball umpire data using the PITCHf/x system that tracks the actual location of each pitch using multiple cameras. By comparing the umpire’s decision to the actual placement of the ball relative to the strike zone they determined that, during the 2008 to 2016 seasons which included 127 different umpires calling over 3.5 million pitches, umpires were correct only part of the time as shown in Table 1. If baseball umpires are getting one out of every eight ball/strike calls wrong, this adds up to more than 30,000 mistakes a year. In most industries, and even other professional sports leagues, this would be unacceptable but baseball traditionalists are hesitant to adopt new technologies that remove the human element from the game.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116629576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dynamics of New Building Construction Costs: Implications for Forecasting Escalation Allowances
Pub Date: 2016-09-01 | DOI: 10.1080/1941658X.2016.1266974
M. Dugan, B. Ewing, M. A. Thompson
Construction projects often require multiple years to complete and the costs of supplies, materials, and labor may increase substantially during a project’s time span. As a result, construction contracts often include an escalation clause to account for cost increases. This article examines the time-series properties of new building construction costs using several producer price indexes. Using a battery of unit root tests, we find substantial evidence that construction cost indexes are generally nonstationary. This finding has implications for the proper specification and use of these series in contract escalation clauses and their respective use in forecasting construction cost increases.
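The article's core tool is the unit root test. As an illustration of the kind of screening involved, here is a minimal sketch using the augmented Dickey-Fuller test from statsmodels; the simulated series is a stand-in for a producer price index, not the article's data.

```python
# Unit root screening sketch: test a (simulated) construction cost index
# for nonstationarity with the augmented Dickey-Fuller test.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
# A random walk with upward drift mimics a nonstationary cost index.
cost_index = 100 + np.cumsum(0.3 + rng.normal(0.0, 1.0, 300))

stat, pvalue, *_ = adfuller(cost_index, autolag="AIC")
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
# A large p-value fails to reject the unit root null: the series behaves
# as nonstationary, the article's general finding for these indexes.
```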
{"title":"Dynamics of New Building Construction Costs: Implications for Forecasting Escalation Allowances","authors":"M. Dugan, B. Ewing, M. A. Thompson","doi":"10.1080/1941658X.2016.1266974","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1266974","url":null,"abstract":"Construction projects often require multiple years to complete and the costs of supplies, materials, and labor may increase substantially during a project’s time span. As a result, construction contracts often include an escalation clause to account for cost increases. This article examines the time-series properties of new building construction costs using several producer price indexes. Using a battery of unit root tests, we find substantial evidence that construction cost indexes are generally nonstationary. This finding has implications for the proper specification and use of these series in contract escalation clauses and their respective use in forecasting construction cost increases.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"128 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116175330","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Multiproduct Cost-Volume-Profit Model: A Resource Reallocation Approach for Decision Making
Pub Date: 2016-09-01 | DOI: 10.1080/1941658X.2016.1251348
Gabriel Soares Zica Bergo, Bruna Hoffmeister Lucas, Vinicius Amorim Sobreiro, M. S. Nagano
This work addresses the problem of reallocating productive resources to maximize profit. Most contributions on the topic focus on developing or improving the Cost-Volume-Profit model to obtain solutions that provide an ideal mix of products for the given data. In particular, several algorithms are available for the problem, such as those proposed by Kakumanu and by Shao and Feng. However, these proposals do not consider the minimum number of units to be produced, and the reallocation of productive resources across products remains a problem in these studies. Bearing this in mind, a new algorithm based on individual financial revenue is proposed. Computational results indicate that the proposed method can be used as a decision support system.
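For context, the classic single-constraint product-mix logic that Cost-Volume-Profit algorithms refine can be sketched as follows. The greedy rule, the product data, and the treatment of minimum quantities are illustrative assumptions, not the authors' algorithm.

```python
# Illustrative CVP product-mix sketch: reserve capacity for minimum
# production quantities, then fill remaining capacity in order of
# contribution margin per unit of the scarce resource.
products = [
    # (name, contribution margin per unit, resource hours per unit, minimum units)
    ("A", 30.0, 2.0, 10),
    ("B", 24.0, 1.5, 20),
    ("C", 18.0, 1.0, 0),
]
capacity = 200.0  # total resource hours available (assumed)

# Reserve capacity for the minimum quantities first.
plan = {name: min_units for name, _, hours, min_units in products}
capacity -= sum(hours * min_units for _, _, hours, min_units in products)

# Allocate what remains by margin per resource hour, highest first.
for name, margin, hours, _ in sorted(products, key=lambda p: p[1] / p[2], reverse=True):
    extra = int(capacity // hours)
    plan[name] += extra
    capacity -= extra * hours

profit = sum(margin * plan[name] for name, margin, _, _ in products)
print(plan, f"total contribution = {profit:.2f}")
```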
{"title":"Multiproduct Cost-Volume-Profit Model: A Resource Reallocation Approach for Decision Making","authors":"Gabriel Soares Zica Bergo, Bruna Hoffmeister Lucas, Vinicius Amorim Sobreiro, M. S. Nagano","doi":"10.1080/1941658X.2016.1251348","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1251348","url":null,"abstract":"This work addresses the problem of reallocating productive resources to maximize profit. Most contributions to the topic focus on developing or improving the Cost-Volume-Profit model to obtain solutions that provide an ideal mix of products before the data is given. In particular, some algorithms are available for the problem, such as the ones proposed by Kakumanu and Shao and Feng. However, these proposals do not consider the minimum number of units to be produced, and the reallocation of productive resources for each product is a problem found in these studies. Bearing this in mind, a new algorithm based on individual financial revenue is proposed. Computational results indicate that the proposed method can be utilized as a decision support system.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116172888","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Editorial Board EOV
Pub Date: 2016-09-01 | DOI: 10.1080/1941658x.2016.1267459
{"title":"Editorial Board EOV","authors":"","doi":"10.1080/1941658x.2016.1267459","DOIUrl":"https://doi.org/10.1080/1941658x.2016.1267459","url":null,"abstract":"","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115460360","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using Robust Statistical Methodology to Evaluate the Cost Performance of Project Delivery Systems: A Case Study of Horizontal Construction
Pub Date: 2016-09-01 | DOI: 10.1080/1941658X.2016.1267598
D. Charoenphol, Steven M. F. Stuban, J. Dever
The objective of this study is to demonstrate the application of the bootstrapping M-estimator (a robust analysis of variance [ANOVA]) to test the null hypothesis of equal means among the cost performance of three project delivery systems (PDS). A planned-contrast methodology is applied after the robust ANOVA to determine where the differences in means lie. The results indicate that the traditional PDS (Design-Bid-Build [DBB]) outperformed the two innovative PDS (Design-Build [DB] and Construction Manager/General Contractor [CMGC]): DBB and CMGC each outperformed DB, and DBB outperformed CMGC, on both the Cost Growth and the Change Order Cost Factor performance measures. These findings can help decision makers and owners make informed, cost-related decisions when choosing a PDS for their projects. Although the case study is based on sample data from the construction industry, the same methodology and statistical process can be applied to other industries and to other factors or variables of interest when the sample data are unbalanced and the normality and homogeneity-of-variance assumptions are violated.
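To make the approach concrete, here is a minimal sketch of a bootstrap comparison of robust group locations in the same spirit, using 20%-trimmed means and a percentile bootstrap. The estimator choice and the synthetic, deliberately unbalanced samples are assumptions for illustration; this is not the study's exact procedure or data.

```python
# Percentile-bootstrap contrast of robust locations for two groups.
import numpy as np
from scipy.stats import trim_mean

rng = np.random.default_rng(1)
dbb = rng.normal(0.02, 0.05, 40)   # cost growth factors, Design-Bid-Build
db  = rng.normal(0.08, 0.10, 25)   # Design-Build (note unbalanced sizes)

def boot_diff_ci(x, y, n_boot=5000):
    """95% percentile-bootstrap CI for the difference of 20%-trimmed means."""
    diffs = [
        trim_mean(rng.choice(x, x.size), 0.2)
        - trim_mean(rng.choice(y, y.size), 0.2)
        for _ in range(n_boot)
    ]
    return np.percentile(diffs, [2.5, 97.5])

lo, hi = boot_diff_ci(dbb, db)
print(f"DBB - DB trimmed-mean difference, 95% CI: [{lo:.3f}, {hi:.3f}]")
# An interval excluding zero flags a difference for that planned contrast.
```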
{"title":"Using Robust Statistical Methodology to Evaluate the Cost Performance of Project Delivery Systems: A Case Study of Horizontal Construction","authors":"D. Charoenphol, Steven M. F. Stuban, J. Dever","doi":"10.1080/1941658X.2016.1267598","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1267598","url":null,"abstract":"The objective of this study is to demonstrate the application of the bootstrapping M-estimator (a robust analysis of variance [ANOVA]) to test the null hypothesis of means equality among the cost performance of the three project delivery systems (PDS). A statistical planned contrast methodology is utilized after the robust ANOVA analysis to further determine where the differences of the means lie. The results of this research concluded that traditional PDS (Design-Bid-Build [DBB]) outperformed the two innovative PDS (Design-Build [DB] and Construction Manager/General Contractor [CMGC]), DBB and CMGC outperformed DB, and DBB outperformed CMGC, for the Cost Growth and the Change Order Cost Factor performance. These findings can help decision makers/owners make an informed decision regarding cost related aspects when choosing PDS for their projects. Though the case study of this research is based on the sample data obtained from the construction industry, the same methodology and statistical process can be applied to other industries and factors/variables of interest when the study sample data are unbalanced and the normality and homogeneity of variance assumptions are violated.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128477948","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Generalized Degrees of Freedom
Pub Date: 2016-05-03 | DOI: 10.1080/1941658X.2016.1191388
Shu-Ping Hu
Two popular regression methods for the multiplicative-error model are the Minimum-Unbiased-Percent Error and the Minimum-Percentage Error under the Zero-Percentage Bias methods. The Minimum-Unbiased-Percent Error method, an Iteratively Reweighted Least Squares regression, does not use any constraints, while the Minimum-Percentage Error under the Zero-Percentage Bias method requires a constraint as part of the curve-fitting process. However, Minimum-Percentage Error under the Zero-Percentage Bias users do not adjust the degrees of freedom to account for constraints included in the regression process. As a result, fit statistics for the Minimum-Percentage Error under the Zero-Percentage Bias equations, e.g., the standard percent error and generalized R², can be incorrect and misleading. This results in incompatible fit statistics between Minimum-Percentage Error under the Zero-Percentage Bias and Minimum-Unbiased-Percent Error equations. This article details why the degrees of freedom should be adjusted and recommends a Generalized Degrees of Freedom measure for calculating fit statistics for constraint-driven cost estimating relationships. It also explains why the Minimum-Percentage Error under the Zero-Percentage Bias method’s standard error underestimates the spread of the cost estimating relationship error distribution. Illustrative examples are provided. Note that this article considers only equality constraints; Generalized Degrees of Freedom for inequality constraints is a separate topic.
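For readers unfamiliar with the unconstrained method, here is a minimal sketch of the Minimum-Unbiased-Percent Error fit as Iteratively Reweighted Least Squares on a two-parameter power-form cost estimating relationship. The data, the functional form, and the fixed iteration count are illustrative assumptions.

```python
# MUPE sketch: fit y = a * x**b under multiplicative error by IRLS,
# weighting each pass's residuals by the previous pass's predictions.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
x = rng.uniform(10, 1000, 30)
y = 5.0 * x**0.8 * rng.lognormal(0.0, 0.15, 30)  # multiplicative error

def fit_mupe(x, y, iters=10):
    a, b = 1.0, 1.0
    for _ in range(iters):
        f_prev = a * x**b  # weights come from the previous iteration's fit
        a, b = least_squares(lambda p: (y - p[0] * x**p[1]) / f_prev,
                             x0=[a, b]).x
    return a, b

a, b = fit_mupe(x, y)
n, p = len(y), 2
spe = np.sqrt(np.sum(((y - a * x**b) / (a * x**b)) ** 2) / (n - p))
print(f"a = {a:.2f}, b = {b:.3f}, standard percent error = {spe:.3f}")
# The article's point: a constrained fit such as MPE-ZPB needs a
# generalized degrees-of-freedom adjustment in the (n - p) divisor.
```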
{"title":"Generalized Degrees of Freedom","authors":"Shu-Ping Hu","doi":"10.1080/1941658X.2016.1191388","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1191388","url":null,"abstract":"Two popular regression methods for the multiplicative-error model are the Minimum-Unbiased-Percent Error and Minimum-Percentage Error under the Zero-Percentage Bias methods. The Minimum-Unbiased-Percent Error method, an Iteratively Reweighted Least Squares regression, does not use any constraints, while the Minimum-Percentage Error under the Zero-Percentage Bias method requires a constraint as part of the curve-fitting process. However, Minimum-Percentage Error under the Zero-Percentage Bias users do not adjust the degrees of freedom to account for constraints included in the regression process. As a result, fit statistics for the Minimum-Percentage Error under the Zero-Percentage bias equations, e.g., the standard percent error and generalized R2, can be incorrect and misleading. This results in incompatible fit statistics between Minimum-Percentage Error under the Zero-Percentage Bias and Minimum-Unbiased-Percent Error equations. This article details why degrees of freedom should be adjusted and recommends a Generalized Degrees of Freedom measure to calculate fit statistics for constraint-driven cost estimating relationships. It also explains why Minimum-Percentage Error under the Zero-Percentage Bias’s standard error underestimates the spread of the cost estimating relationship error distribution. Illustrative examples are provided. Note that this article only considers equality constraints; Generalized Degrees of Freedom for inequality constraints is another topic.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116782700","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using Pre-Milestone B Data to Predict Schedule Duration for Defense Acquisition Programs
Pub Date: 2016-05-03 | DOI: 10.1080/1941658X.2016.1201024
Christopher A. Jimenez, E. White, Gregory Brown, J. Ritschel, B. Lucas, Michael J. Seibel
Accurately predicting a realistic schedule for a defense acquisition program is a difficult challenge given the inherent risks and uncertainties present in the early stages of a program. Through multiple regression modeling, we provide the program manager with a statistical model that predicts schedule duration from official program initiation, which occurs at Milestone B, to the initial operational capability of the program’s deliverable system. Our model explains 42.9% of the variation in schedule duration across historical data from a sample of 56 defense programs from all military services. Statistically significant predictor variables include whether a program is a new effort or a modification to an existing program, the year of Milestone B start as it relates to changes in defense acquisition reform policy, and the amount of raw funding (adjusted for inflation) prior to Milestone B. Our final and strongest predictor variable, the percentage of the total RDT&E (Research, Development, Test, and Evaluation) funding profile allocated at Milestone B, indicates that an increased percentage of RDT&E funding for pre-Milestone B technology risk reduction may shorten a program’s schedule duration to initial operational capability.
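As a sketch of the modeling setup described above, the following fits an ordinary least squares model of schedule duration on paraphrased versions of the article's predictors. The data are synthetic and the variable names are stand-ins, not the article's dataset or final specification.

```python
# Schedule-duration regression sketch on synthetic program data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 56
new_start  = rng.integers(0, 2, n)        # 1 = new effort, 0 = modification
msb_year   = rng.integers(1990, 2010, n)  # Milestone B calendar year
rdte_share = rng.uniform(0.1, 0.9, n)     # fraction of RDT&E funded at MS B
duration = (120 + 18 * new_start - 0.5 * (msb_year - 2000)
            - 40 * rdte_share + rng.normal(0, 15, n))  # months to IOC

X = sm.add_constant(np.column_stack([new_start, msb_year, rdte_share]))
model = sm.OLS(duration, X).fit()
print(f"R-squared: {model.rsquared:.3f}")  # analogous to the 42.9% figure
print(model.params)  # a negative rdte_share coefficient would mirror the
                     # finding that early RDT&E funding shortens schedules
```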
{"title":"Using Pre-Milestone B Data to Predict Schedule Duration for Defense Acquisition Programs","authors":"Christopher A. Jimenez, E. White, Gregory Brown, J. Ritschel, B. Lucas, Michael J. Seibel","doi":"10.1080/1941658X.2016.1201024","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1201024","url":null,"abstract":"Accurately predicting a realistic schedule for a defense acquisition program is a difficult challenge considering the inherent risk and uncertainties present in the early stages of a program. Through the application of multiple regression modeling, we provide the program manager with a statistical model that predicts schedule duration from official program initiation, which occurs at Milestone B, to the initial operational capability of the program’s deliverable system. Our model explains 42.9% of the variation in schedule duration across historical data from a sample of 56 defense programs from all military services. Statistically significant predictor variables include whether a program is a new effort or modification to an existing program, the year of Milestone B start as it relates to changes in defense acquisition reform policy, and the amount of raw funding (adjusted for inflation) prior to Milestone B for a program. Our final and strongest predictor variable, percentage of the total RDT&E (Research Development Test and Evaluation) funding profile allocated at Milestone B, indicates that increased percentage of RDT&E funding for pre-Milestone B technology risk reduction may shorten a program’s schedule duration to initial operational capability.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116318650","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Development of Unit Cost Indices and Database for Water and Wastewater Pipelines Capital Works
Pub Date: 2016-05-03 | DOI: 10.1080/1941658X.2016.1201023
R. Rehan, Rizwan Younis, A. Unger, B. Shapton, Filip Budimir, M. Knight
The objective of this work is to develop a unit cost database and index for water and wastewater pipelines capital works and to estimate inflation in their construction costs. This was accomplished by analyzing tender summaries and progress certificates from the cities of Niagara Falls and Waterloo, Ontario, Canada, spanning the period from 1980 to 2008, together with data from the RS Means construction cost database. This work describes the source data, the data preparation procedure, and the development of the unit cost database and indices. The process first involved developing scaling relationships between the costs of standard components and their sizes through regression analysis of data from the tender summaries and the RS Means database. Next, unit costs of reference watermain and sanitary sewer projects and of standard components are computed. Finally, a relational database is developed to store the data and perform the unit cost analysis.
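The scaling-relationship step lends itself to a short sketch: regress log unit cost on log component size to recover a power-law relationship, cost ≈ a · size^b. The diameters and unit costs below are illustrative placeholders, not values from the tender data.

```python
# Power-law scaling fit for a pipe component: unit cost vs. diameter.
import numpy as np

diameter_mm = np.array([150, 200, 250, 300, 375, 450, 600])
unit_cost   = np.array([210, 260, 330, 380, 470, 560, 720])  # $/m, assumed

b, log_a = np.polyfit(np.log(diameter_mm), np.log(unit_cost), 1)
a = np.exp(log_a)
print(f"unit cost ≈ {a:.1f} * diameter^{b:.2f}")
# The fitted exponent scales costs of non-reference sizes from a
# reference component when populating the unit cost database.
```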
{"title":"Development of Unit Cost Indices and Database for Water and Wastewater Pipelines Capital Works","authors":"R. Rehan, Rizwan Younis, A. Unger, B. Shapton, Filip Budimir, M. Knight","doi":"10.1080/1941658X.2016.1201023","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1201023","url":null,"abstract":"The objective of this work is to develop a unit cost database and index for water and wastewater pipelines capital works, and estimate inflation in their construction cost. This was accomplished by analyzing tender summaries and progress certificates from the cities of Niagara Falls and Waterloo, Ontario, Canada, that span the period from 1980 to 2008, as well as using data from RS Means construction cost database. This work describes the source data, data preparation procedure, and development of unit cost database and indices. The process first involved developing scaling relationships between the cost of standard components and their sizes by regression analysis using data from tender summaries and the RS Means database. Next, unit costs of reference watermain and sanitary sewer projects and standard components are computed. Finally, a relational database is developed to store the data and to perform the unit cost analysis.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123342470","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Rapid Cost Estimation for Storms Recovery Using Geographic Information Systems
Pub Date: 2016-01-02 | DOI: 10.1080/1941658X.2016.1155184
Rolando A. Berríos-Montero, Steven M. F. Stuban, J. Dever
The present study introduces a new approach to estimating the recovery costs of public property in the aftermath of a storm by integrating geographic information systems. Estimating recovery costs for a disaster is an ongoing concern for emergency responders. This work focuses on applying economic indicators, population data, and storm event tracking within geographic information systems to rapidly estimate recovery costs. First, recovery costs of historical events are normalized and adjusted for inflation, wealth, and population. Geospatial analysis is used to map political boundaries and population density. Second, rapid recovery cost estimation is accomplished by defining population, personal income, and gross domestic product. Finally, a jurisdiction’s fiscal capacity is calculated, illustrating the economic capability of jurisdictions to finance public property recovery based on the size of their economies. The variability of estimated absolute errors between cost estimates and actual normalized costs is also examined. Our results reveal that jurisdiction fiscal capacity is a more suitable metric for rapidly estimating the recovery costs of public property than the method presently followed by the Federal Emergency Management Agency. This new approach effectively aids local governments by providing quick cost guidance to recovery responders while offering the ability to construct accurate recovery cost estimates.
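The normalization step admits a one-line formula: a historical cost is scaled by inflation, wealth, and population ratios between the event year and the present. The factors in this sketch are illustrative assumptions, not the study's values.

```python
# Normalize a historical recovery cost to present-day terms.
def normalize_cost(cost, infl_ratio, wealth_ratio, pop_ratio):
    """Scale a historical cost by inflation, wealth, and population growth."""
    return cost * infl_ratio * wealth_ratio * pop_ratio

# Example: a $10M recovery for a county with 60% inflation, 20% real
# wealth growth, and 30% population growth since the event.
print(f"${normalize_cost(10e6, 1.60, 1.20, 1.30):,.0f}")  # $24,960,000
```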
{"title":"Rapid Cost Estimation for Storms Recovery Using Geographic Information Systems","authors":"Rolando A. Berríos-Montero, Steven M. F. Stuban, J. Dever","doi":"10.1080/1941658X.2016.1155184","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1155184","url":null,"abstract":"The present study introduces a new approach to estimate the recovery costs of public property in the aftermath of a storm, by integrating geographic information systems. Estimating recovery costs for a disaster is a current concern for emergency responders. This work focuses on applying economic indicators, population, and storm event tracking to geographic information systems for rapidly estimating recovery costs. Firstly, recovery costs of historical events are normalized and adjusted for inflation, wealth, and population. Geospatial analysis is used to predict, manage, and learn political boundaries and population density. Secondly, rapid recovery cost estimation is accomplished by defining population, personal income, and gross domestic product. Finally, a jurisdiction fiscal capacity is calculated illustrating the economic capability of jurisdictions to finance public property recovery based on their economy size. The variability of estimated absolute errors between cost estimates and actual normalized costs are also examined. Our results reveal that jurisdiction fiscal capacity is a more suitable metric for rapidly estimating recovery costs of public properties than the method presently followed by the Federal Emergency Management Agency. This new approach effectively aids the local government providing quick cost guidance to recovery responders, while offering the ability to construct accurate recovery cost.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129822005","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Forecasting the Unit Price of Water and Wastewater Pipelines Capital Works and Estimating Contractors’ Markup
Pub Date: 2016-01-02 | DOI: 10.1080/1941658X.2016.1155187
Rizwan Younis, R. Rehan, A. Unger, Soonyoung Yu, M. Knight
Municipalities and water utilities need to make realistic estimates for the replacement of their aging water and wastewater pipelines. The two main objectives of this article are to present a method for forecasting the unit price of water and wastewater pipelines capital works by investigating inflation in their construction prices, and to quantify the markup that contractors add when bidding a project price. The Geometric Brownian Motion model with drift is used for the investigation. Results show that inflation in the water and wastewater pipeline reference projects was 6.41% and 5.52% per annum, respectively. These values compare with inflation in the Standard & Poor’s/Toronto Stock Exchange (S&P/TSX) Composite Index of 6.93% per annum. In contrast, inflation in Canada’s Consumer Price Index (CPI) and in the Engineering News-Record Construction Cost Index (ENR’s CCI) for Toronto is estimated at 2.53% and 2.85% per annum, respectively. The spread in the inflation rate between the reference price indices and either ENR’s CCI or the CPI is a measure of the market price of the catchall financial premium (defined as markup) that contractors add to project cost to account for profit, risk, and market conditions. This spread is estimated at 3.56% and 2.67% per annum for water and wastewater pipeline capital works, respectively.
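As an illustration of the estimation step, under Geometric Brownian Motion the log returns of an index are i.i.d. normal, so the drift (the inflation rate reported above) follows from their sample moments. The sketch below uses a synthetic index; the true drift and volatility values are assumptions.

```python
# Estimate GBM drift from an annual price index series.
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 0.06, 0.08  # assumed annual drift and volatility
log_returns = rng.normal(mu - 0.5 * sigma**2, sigma, 28)
index = 100 * np.exp(np.cumsum(log_returns))  # a 1980-2008 style series

r = np.diff(np.log(index))
sigma_hat = r.std(ddof=1)
mu_hat = r.mean() + 0.5 * sigma_hat**2  # GBM drift from log-return moments
print(f"estimated drift ≈ {mu_hat:.2%} per annum")
# The markup spread is the reference index drift minus CPI or ENR's CCI
# drift, e.g., 6.41% - 2.85% ≈ 3.56% for water pipelines vs. ENR's CCI.
```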
{"title":"Forecasting the Unit Price of Water and Wastewater Pipelines Capital Works and Estimating Contractors’ Markup","authors":"Rizwan Younis, R. Rehan, A. Unger, Soonyoung Yu, M. Knight","doi":"10.1080/1941658X.2016.1155187","DOIUrl":"https://doi.org/10.1080/1941658X.2016.1155187","url":null,"abstract":"Municipalities and water utilities need to make realistic estimates for the replacement of their aged water and wastewater pipelines. The two main objectives of this article are to present a method to forecast the unit price of water and wastewater pipelines capital works by investigating inflation in their construction price, and to quantify the markup that contractors add to bid a project price. The Geometric Brownian Motion model with drift is used for investigation. Results show that the inflation in water and wastewater pipelines reference projects were 6.41% and 5.52% per annum, respectively. These values compare to the inflation in the Standard & Poor’s/Toronto Stock Exchange (S&P/TSX) Composite Index of 6.93% per annum. In contrast, inflation in Canada’s Consumer Price Index (CPI), and Engineering News-Record’s Construction Cost Index (ENR’s CCI) for Toronto are estimated to be 2.53% and 2.85% per annum, respectively. The spread in the inflation rate between the reference price indices and that of either ENR’s CCI or CPI is a measure of the market price of catchall financial premium (defined as markup) that contractors add to project cost to account for profit, risk, and market conditions. This spread is estimated to be 3.56% and 2.67% per annum for water and wastewater pipeline capital works, respectively.","PeriodicalId":390877,"journal":{"name":"Journal of Cost Analysis and Parametrics","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131082361","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}