Calibration of scientific reasoning ability
Caitlin Drummond Otten, Baruch Fischhoff
Journal of Behavioral Decision Making (Q3, Psychology, Applied; Impact Factor 1.8)
Published: 2022-11-04 · DOI: 10.1002/bdm.2306
https://onlinelibrary.wiley.com/doi/10.1002/bdm.2306
Citations: 0
Abstract
Scientific reasoning ability, the ability to reason critically about the quality of scientific evidence, can help laypeople use scientific evidence when making judgments and decisions. We ask whether individuals with greater scientific reasoning ability are also better calibrated with respect to their ability, comparing calibration for skill with the more widely studied calibration for knowledge. In three studies, participants (Study 1: N = 1022; Study 2: N = 101; and Study 3: N = 332) took the Scientific Reasoning Scale (SRS; Drummond & Fischhoff, 2017), comprising 11 true–false problems, and provided confidence ratings for each problem. Overall, participants were overconfident, reporting mean confidence levels that were 22.4%–25% higher than their percentages of correct answers; calibration improved with score. Study 2 found similar calibration patterns for the SRS and another skill measure, the Cognitive Reflection Test (CRT), which assesses the ability to avoid intuitive but incorrect answers. SRS and CRT scores were both associated with success at avoiding negative decision outcomes, as measured by the Decision Outcomes Inventory; confidence on the SRS, above and beyond scores, predicted worse outcomes. Study 3 added an alternative measure of calibration, asking participants to estimate the number of items they had answered correctly. Participants were less overconfident by this measure. SRS scores predicted correct use of scientific information in a drug facts box task and holding beliefs consistent with the scientific consensus on controversial issues; confidence, above and beyond SRS scores, predicted worse drug facts box performance but stronger science-consistent beliefs. We discuss the implications of our findings for improving science-relevant decision-making.
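The abstract's primary calibration measure is the gap between mean item-level confidence and the percentage of items answered correctly. The sketch below illustrates that arithmetic with hypothetical data; the function name and the sample responses are invented for illustration and are not the authors' code or data.

```python
def overconfidence(confidence, correct):
    """Overconfidence = mean confidence (0-100 scale) minus percentage correct.

    confidence: per-item confidence ratings, 0-100.
    correct: per-item correctness, 1 (right) or 0 (wrong).
    A positive return value indicates overconfidence.
    """
    assert len(confidence) == len(correct) and len(correct) > 0
    mean_confidence = sum(confidence) / len(confidence)
    percent_correct = 100 * sum(correct) / len(correct)
    return mean_confidence - percent_correct

# Hypothetical responses to 11 true-false items (like the SRS):
conf = [90, 80, 70, 95, 60, 85, 75, 90, 65, 80, 70]   # mean ~78.2
right = [1, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0]             # 6/11 ~54.5% correct
print(f"Overconfidence: {overconfidence(conf, right):.1f} points")
```

A participant like this one, whose average confidence exceeds their accuracy by roughly 24 points, falls inside the 22.4–25 point range the studies report. Study 3's alternative measure would instead compare an estimated total ("I got 8 of 11 right") against the actual total.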
About the Journal
The Journal of Behavioral Decision Making is a multidisciplinary journal with a broad base of content and style. It publishes original empirical reports, critical review papers, theoretical analyses, and methodological contributions. The Journal also features reviews of books, software, and decision-aiding techniques, abstracts of important articles published elsewhere, and teaching suggestions. The objective of the Journal is to present and stimulate behavioral research on decision making and to provide a forum for the evaluation of complementary, contrasting, and conflicting perspectives. These perspectives include psychology, management science, sociology, political science, and economics. Studies of behavioral decision making in naturalistic and applied settings are encouraged.