A. Klement and Z. Neeman, "Optimal Design Settlement Devices for Cases of Disputed Liability: Fee-Shifting Rules and Pleadings Mechanisms," Torts & Products Liability Law, February 2002. doi:10.2139/ssrn.304200

We study the effect of alternative fee-shifting rules on the probability of settlement when the defendant's liability is under dispute. Using a mechanism design approach, we demonstrate that the probability of settlement is maximized by a particular pleadings mechanism: both parties are given the choice to opt into the mechanism; if they do so, the defendant is asked to plead liable or not liable. Based on the defendant's pleading, the plaintiff is offered a settlement amount which, if accepted, is binding on both parties. If the plaintiff refuses the offer, the case goes to trial and the allocation of litigation costs between the parties is set according to the outcome of the trial and the defendant's pleading of liability. When the background rule for allocating litigation costs is the American rule, we show that the probability of settlement is maximized by requiring the plaintiff to bear both litigants' costs when the defendant has admitted liability, irrespective of the outcome of the trial, and by applying the Pro-Plaintiff rule when the defendant has denied liability. Extensions that allow for court inaccuracy, different background rules, variable shares of costs shifted, and deterrence are considered.
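To make the branching structure of that cost-allocation rule concrete, here is a minimal sketch of the trial-stage allocation it describes. The function name and the encoding of the Pro-Plaintiff rule as one-way fee shifting (the defendant bears both parties' costs only when the plaintiff prevails, with the American rule otherwise) are my assumptions for illustration, not definitions taken from the paper.

```python
# Hypothetical sketch of the trial-stage cost allocation described above,
# against an American-rule background. The "Pro-Plaintiff rule" branch is an
# assumed encoding (one-way fee shifting in the plaintiff's favor).
def trial_cost_allocation(defendant_pled_liable: bool,
                          plaintiff_wins: bool,
                          c_plaintiff: float,
                          c_defendant: float) -> tuple[float, float]:
    """Return (costs borne by plaintiff, costs borne by defendant) at trial."""
    if defendant_pled_liable:
        # Plaintiff rejected the offer after an admission of liability:
        # plaintiff bears both parties' costs, whatever the trial outcome.
        return c_plaintiff + c_defendant, 0.0
    if plaintiff_wins:
        # Liability denied and plaintiff prevails: costs shift to the defendant.
        return 0.0, c_plaintiff + c_defendant
    # Liability denied and plaintiff loses: each side bears its own costs
    # (the American-rule background).
    return c_plaintiff, c_defendant
```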
M. Faber and R. Manstetten, "Economic Behaviour and Responsibility: The Example of the Soda and Chlorine Industry," Torts & Products Liability Law, February 2002. doi:10.2139/ssrn.303699

The paper discusses the relationship between economic action and responsibility. The starting point of the argument is a comparison of the positions of Socrates and of representatives of modern economics on the relationship between the economy and the environment. As a historical example, the authors consider the development of the production of chlorofluorocarbons (CFCs), particularly with respect to their role in ozone layer depletion. The second part of the paper investigates the relationship between the phenomenon of joint production and the problem of responsibility on a general scale. Special emphasis is given to ignorance concerning emerging joint products. Finally, general approaches to solving the problem of responsibility are examined.
A. Dixit and P. Picard, "On the Role of Good Faith in Insurance Contracting," Torts & Products Liability Law, January 2002. doi:10.2139/ssrn.303841

The law of insurance contracts provides that if the policyholder is shown to have knowingly misrepresented material facts about his risks in his application, the insurer can cancel the contract ex post facto and refuse to pay any claims. This good faith principle is widespread, but implemented with unequal strictness under common law or statute law. In this paper, we analyze the role of good faith in insurance applications when policyholders are imperfectly informed about their risk type. We extend the Rothschild-Stiglitz (1976) model of an insurance market with adverse selection to the situation where individuals only receive a signal of their risk type and where a costly verification of the individuals' risk type and/or signal is possible. We characterize the optimal investigation strategy of the insurer, and the insurance indemnity that should be paid contingent on the result of the investigation, when the insurance market is at a competitive equilibrium. We show that the high-risk types get full, fair insurance without any investigation. The contract intended for the low-risk types involves probabilistic investigation, either of the signal directly, or of the risk type and then of the signal if a high-risk type is revealed, depending on the costs of the two types of investigation and the posterior probability of the signal. In either case, the equilibrium is Pareto superior to that in the original Rothschild-Stiglitz model, and it exists for a larger range of the population proportions of the two risk types. We also analyze the question of the onus of proof when intentional misrepresentation of risk is alleged by the insurer, and find that the optimal choice of the legislative rule depends on the rival parties' costs of proving good or bad faith.
D. Kaye, "The Dynamics of Daubert: Methodology, Conclusions, and Fit in Statistical and Econometric Studies," Torts & Products Liability Law, December 2001. doi:10.2307/1073909

This paper reviews the development of the law governing the admissibility of statistical studies. It analyzes the leading cases on scientific evidence and suggests that both the "reliability" and the "general acceptance" standards raise two major difficulties: the "boundary problem" of identifying the type of evidence that warrants careful screening, and the "usurpation problem" of keeping the trial judge from closing the gate on evidence that should be left for the jury to assess. The paper proposes partial solutions to these problems and applies them to statistical and econometric proof, particularly in the context of a recent antitrust case. It concludes that Daubert-like screening of complex statistical analyses is a salutary development, but that the task requires the elaboration of standards that attend to the distinction between a general methodology and a specific conclusion. Screening statistical proof demands some sophistication in evaluating the choice of a research design or statistical model, the variables included in a particular model, the procedures taken to verify the usefulness of the model for the data at hand, and the inferences or estimates that follow from the statistical analysis. The factors enumerated in Daubert work reasonably well with some of these aspects of the expert's work, but they are less well adapted to others. If the "intellectual rigor" standard of Kumho Tire is used to fill the gap, it must be applied with some caution lest it become a subterfuge for excluding expert testimony that is less than ideal but still within the range of reasonable scientific debate.
A. Alberini, A. Krupnick, M. Cropper, N. Simon, and Joseph Cook, "The Willingness to Pay for Mortality Risk Reductions: A Comparison of the United States and Canada," Torts & Products Liability Law, December 2001. doi:10.2139/ssrn.293662
We present results for two contingent valuation surveys conducted in Hamilton, Canada and the US to elicit WTP for mortality risk reductions. We find similar Value of Statistical Life estimates across the two studies, ranging from USD 930,000 to USD 4.8 million (2000 US dollars). WTP increases with risk reduction size, but varies little with respondent age: individuals aged over 70 years hold WTP values approximately one-third lower than other respondents. Respondent health status has limited effect on WTP. These results provide little or no evidence for adjusting VSL estimates used in policy analyses for the affected population’s age or health status.
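As a reminder of how such estimates are constructed, the value of a statistical life is the mean willingness to pay divided by the size of the risk reduction it purchases. The figures below are illustrative numbers of my own, not the survey's actual payment amounts or risk changes.

```latex
% Illustrative VSL arithmetic (hypothetical WTP and risk-reduction figures)
\[
  \text{VSL} \;=\; \frac{\text{mean WTP}}{\Delta \text{risk}}
  \;=\; \frac{\$500}{5/10{,}000}
  \;=\; \$1{,}000{,}000,
\]
% which falls inside the USD 0.93--4.8 million range reported above.
```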
P. Lemieux, "The World Bank's Tobacco Economics," Torts & Products Liability Law, November 2001. doi:10.2139/ssrn.291785

Economists have argued for two decades that smokers do not transfer costs to non-smokers: besides paying heavy tobacco taxes, smokers typically die at an earlier age than non-smokers, who ring up substantial old-age care costs. But the more important question is whether there are net social benefits or costs to smoking. World Bank economists have been using very creative welfare economics to argue that tobacco ultimately inflicts a net cost on society and that the optimal consumption level of tobacco is zero. Those arguments conflict with the standard economic presumption that a good freely produced and consumed produces a net social benefit.
Alex Stein, "Of Two Wrongs that Make a Right: Two Paradoxes of the Evidence Law and Their Combined Economic Justification," Torts & Products Liability Law, October 2001. doi:10.2139/ssrn.271428

This essay offers a new rationale for the standard of proof requirements in civil trials. The civil proof doctrine, as traditionally understood, presents two economic paradoxes (or anomalies). First, it focuses on accuracy ex post by requiring judges to reconstruct the relevant events, as they unfolded in reality, including the actual damage to the plaintiff, based on the information available at the trial. This retroactive (ex post) accuracy is both expensive and may undermine deterrence along with other economic objectives of the law. For deterrence purposes, only information that had been available to the defendant prior to taking the litigated action (ex ante information) matters. Moreover, accuracy ex post is an investment-dependent opportunity rather than a static good. As such, it fosters a secondary market for competitive adversarial investments in information, which might adversely affect the primary market, that is, the market for goods, services, risks and precautions. Thus, when prospective litigants are rationally unwilling to commit themselves to the required investments in information, inefficiencies are bound to occur. In such cases, each party will account for the possibility that he will be wrongfully defeated in the future trial because his opponent's investment in information outscored his. This prospect will foil transactions that are otherwise efficient and chill many other socially beneficial activities. Second, if the doctrine is nonetheless committed to accuracy ex post, then it should require judges to determine the ultimate probability of the plaintiff's case by multiplying the probabilities of the relevant entitlement, breach and damage. Yet the doctrine refuses to apply the multiplication principle and thus reduces the total number of correct verdicts, instead of maximizing it. On these grounds, the controlling civil proof doctrine has been criticized as economically unsound.

Under the new rationale offered by this essay, the two alleged wrongs make a right, since in combination they generate a synergetic mechanism that aligns, to the extent feasible, the ex ante and the ex post probabilities of transgression. This alignment is attained by the combined, but not conjunctive, functioning of the two probabilities: the probability of the litigated entitlement and the ex post probability of the entitlement's breach. The entitlement's probability dominates the defendant's ex ante information, thus adjusting the ex ante probability of breach. This adjustment is achieved through the visibility element uniformly featured by legal entitlements: under the definition of virtually any entitlement, the entitlement must both exist and be reasonably ascertainable ex ante, that is, at the time and in the circumstances of its breach. The ex post probability of breach has a different function, namely, to substantiate the allegation that the defendant has actually violated the entitlement. This combined framework secures the appropriate alignment between the ex ante and the ex post probabilities of transgression.
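The multiplication point can be made concrete with a stylized example of my own (not the essay's): suppose each element of the claim is established to probability 0.7, comfortably above the preponderance threshold of 0.5.

```latex
% Stylized illustration of the multiplication (conjunction) problem
\[
  P(\text{case}) \;=\; P(\text{entitlement}) \times P(\text{breach}) \times P(\text{damage})
  \;=\; 0.7 \times 0.7 \times 0.7 \;\approx\; 0.34 \;<\; 0.5,
\]
% so the claim as a whole would fail the preponderance standard even though
% every element clears it -- which is why accepting or rejecting the
% multiplication principle changes verdicts.
```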
M. Giles, "Heckman's Methodology for Correcting Selectivity Bias: An Application to Road Crash Costs," Torts & Products Liability Law, September 2001. doi:10.2139/ssrn.288275

Aggregate road crash costs are traditionally determined using average costs applied to incidence figures found in Police-notified crash data. Such data comprise only a non-random sample of the true population of road crashes, the bias being due to the existence of crashes that are not notified to the Police. The traditional approach is to label the Police-notified sample as 'non-random', thereby casting a cloud over data analyses using this sample. Heckman, however, viewed similar problems as 'omitted variables' problems, in that the exclusion of some observations in a systematic manner (so-called selectivity bias) inadvertently introduces the need for an additional regressor in the least squares procedure. Using Heckman's methodology for correcting this selectivity bias, Police-notified crash data for Western Australia in 1987/88 are reconciled with total (notified and not notified) crash data in the estimation of the property damage costs of road crashes.
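A minimal sketch of the two-step correction in the spirit described above, run on simulated data: police notification plays the role of the selection mechanism, and the inverse Mills ratio enters the cost regression as the otherwise "omitted" regressor. Variable names and the data-generating process are illustrative, not the paper's Western Australian specification.

```python
# Minimal sketch of Heckman's two-step selectivity correction on simulated data.
# Variable names (severity, urban, log_cost, ...) are illustrative assumptions.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5_000
severity = rng.normal(size=n)          # crash severity, observed for all crashes
urban = rng.integers(0, 2, size=n)     # urban vs. rural indicator

# Simulate correlated selection (notification) and outcome (property damage)
u, e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n).T
notified = (0.2 + 1.0 * severity + 0.3 * urban + u > 0).astype(int)
log_cost = 7.0 + 0.8 * severity + 0.2 * urban + e   # observed only if notified

# Step 1: probit for the probability that a crash is police-notified
Z = sm.add_constant(np.column_stack([severity, urban]))
probit = sm.Probit(notified, Z).fit(disp=False)
xb = Z @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)      # inverse Mills ratio

# Step 2: OLS on the notified subsample, adding the inverse Mills ratio
sel = notified == 1
X = sm.add_constant(np.column_stack([severity[sel], urban[sel], imr[sel]]))
ols = sm.OLS(log_cost[sel], X).fit()
print(ols.params)                      # selectivity-corrected cost coefficients
```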
Mark D. Brenner, David Fairris, and J. Ruser, "'Flexible' Work Practices and Occupational Safety and Health: Exploring the Relationship between Cumulative Trauma Disorders and Workplace Transformation," Torts & Products Liability Law, August 2001. doi:10.2139/ssrn.333762

This paper matches establishment-level data on workplace transformation (e.g., quality circles, work teams, and just-in-time production) with measures of cumulative trauma disorders at these same establishments to explore the relationship between 'flexible' workplace practices and workplace health and safety. The results reveal a positive, statistically significant, and quantitatively sizeable relationship between cumulative trauma disorders and the use of quality circles and just-in-time production.
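A sketch of the kind of establishment-level merge-and-regress exercise the abstract describes, on simulated placeholder data; the column names, linear functional form, and controls are hypothetical, not the paper's actual specification.

```python
# Illustrative establishment-level regression of CTD rates on practice adoption.
# Data are simulated placeholders; the specification is an assumption.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 800
df = pd.DataFrame({
    "quality_circles": rng.integers(0, 2, n),
    "work_teams": rng.integers(0, 2, n),
    "jit": rng.integers(0, 2, n),
    "employment": rng.integers(50, 2_000, n),
    "industry": rng.choice(["metal", "auto", "food"], n),
})
# CTD cases per 100 workers, loading positively on quality circles and JIT
df["ctd_rate"] = (1.0 + 0.6 * df.quality_circles + 0.5 * df.jit
                  + 0.0002 * df.employment + rng.normal(0, 1, n))

model = smf.ols("ctd_rate ~ quality_circles + work_teams + jit"
                " + employment + C(industry)", data=df).fit(cov_type="HC1")
print(model.params)   # positive coefficients on quality_circles and jit
```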
James Vanderhoff and D. Coate, "The Truth About Light Trucks: Despite Critics' Claims, SUVs Are Saving Lives," Torts & Products Liability Law, April 2001. doi:10.2139/ssrn.267074

To determine the relationship between light trucks and motor vehicle fatalities, we formulated a simultaneous equations model that considered the effect that light truck usage and other variables had on fatality rates over the period 1994 to 1997. We discovered that there is a negative relationship between light truck registration and the motor vehicle fatality rate in both single-vehicle and multiple-vehicle accidents. Our elasticity estimates indicate that the five-percent increase in light truck registrations in the United States over the time period 1994 to 1997 lowered single-vehicle fatalities per driver by 7.5 percent and multiple-vehicle fatalities per driver by two percent. These figures translate into about 2,000 lives saved.
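The elasticities implicit in these figures follow from simple arithmetic; the decomposition below is my own back-of-the-envelope reading of the reported numbers, not the paper's estimation output.

```latex
% Back-of-the-envelope reading of the reported elasticities
\[
  \%\Delta F \;\approx\; \varepsilon \cdot \%\Delta R,
  \qquad
  \varepsilon_{\text{single}} \approx \frac{-7.5\%}{5\%} = -1.5,
  \qquad
  \varepsilon_{\text{multi}} \approx \frac{-2\%}{5\%} = -0.4,
\]
% where F denotes fatalities per driver and R denotes light-truck registrations.
```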