{"title":"工科学生风险评估偏差的估计与决策偏好实验","authors":"Jeremy M. Gernand","doi":"10.1115/1.4055156","DOIUrl":null,"url":null,"abstract":"\n Engineering decisions that have the greatest effect on worker and public safety occur early in the design process. During these decisions, engineers rely on their experience and intuition to estimate the severity and likelihood of undesired future events like failures, equipment damage, injuries, or environmental harm. These initial estimates can then form the basis of investment of limited project resources in mitigating those risks. Behavioral economics suggests that most people make significant and predictable errors when considering high consequence, low probability events. Yet, these biases have not previously been studied quantitatively in the context of engineering decisions. This paper describes results from a set of computer-based engineering assessment and decision experiments with undergraduate engineering students estimating, prioritizing, and making design decisions related to risk. The subjects included in this experiment overestimated the probability of failure, deviated significantly from anticipated risk management preferences, and displayed worsening biases with increasing system complexity. These preliminary results suggest that considerably more effort is needed to understand the characteristics and qualities of these biases in risk estimation and understand what kinds of interventions might best ameliorate these biases and enable engineers to more effectively identify and manage the risks of technology.","PeriodicalId":44694,"journal":{"name":"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems Part B-Mechanical Engineering","volume":"35 1","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2022-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Set of Estimation and Decision Preference Experiments for Exploring Risk Assessment Biases in Engineering Students\",\"authors\":\"Jeremy M. Gernand\",\"doi\":\"10.1115/1.4055156\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\n Engineering decisions that have the greatest effect on worker and public safety occur early in the design process. During these decisions, engineers rely on their experience and intuition to estimate the severity and likelihood of undesired future events like failures, equipment damage, injuries, or environmental harm. These initial estimates can then form the basis of investment of limited project resources in mitigating those risks. Behavioral economics suggests that most people make significant and predictable errors when considering high consequence, low probability events. Yet, these biases have not previously been studied quantitatively in the context of engineering decisions. This paper describes results from a set of computer-based engineering assessment and decision experiments with undergraduate engineering students estimating, prioritizing, and making design decisions related to risk. The subjects included in this experiment overestimated the probability of failure, deviated significantly from anticipated risk management preferences, and displayed worsening biases with increasing system complexity. 
These preliminary results suggest that considerably more effort is needed to understand the characteristics and qualities of these biases in risk estimation and understand what kinds of interventions might best ameliorate these biases and enable engineers to more effectively identify and manage the risks of technology.\",\"PeriodicalId\":44694,\"journal\":{\"name\":\"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems Part B-Mechanical Engineering\",\"volume\":\"35 1\",\"pages\":\"\"},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2022-08-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems Part B-Mechanical Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1115/1.4055156\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems Part B-Mechanical Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1115/1.4055156","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
A Set of Estimation and Decision Preference Experiments for Exploring Risk Assessment Biases in Engineering Students
Engineering decisions that have the greatest effect on worker and public safety occur early in the design process. During these decisions, engineers rely on their experience and intuition to estimate the severity and likelihood of undesired future events such as failures, equipment damage, injuries, or environmental harm. These initial estimates can then form the basis for investing limited project resources in mitigating those risks. Behavioral economics suggests that most people make significant and predictable errors when considering high-consequence, low-probability events. Yet, these biases have not previously been studied quantitatively in the context of engineering decisions. This paper describes results from a set of computer-based engineering assessment and decision experiments in which undergraduate engineering students estimated, prioritized, and made design decisions related to risk. The subjects in these experiments overestimated the probability of failure, deviated significantly from anticipated risk management preferences, and displayed worsening biases with increasing system complexity. These preliminary results suggest that considerably more effort is needed to understand the characteristics of these biases in risk estimation and to determine what kinds of interventions might best ameliorate them, enabling engineers to more effectively identify and manage the risks of technology.
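To make the prioritization problem concrete, the following is a minimal sketch (not taken from the paper) of how ranking mitigations by expected loss, i.e., likelihood times severity, can be distorted when small probabilities are overestimated, the kind of bias the experiments probe. All scenario names, probabilities, consequence costs, and the overweighting factor below are hypothetical and chosen only for illustration.

```python
# Illustrative sketch only: how overestimating small probabilities can change
# which risk looks most urgent. All scenarios and numbers are hypothetical.

def expected_loss(probability: float, consequence_cost: float) -> float:
    """Expected loss of an undesired event: likelihood times severity."""
    return probability * consequence_cost

def overweighted(p: float, factor: float = 50.0) -> float:
    """Crude stand-in for probability overestimation of rare events."""
    return min(1.0, p * factor)

# Hypothetical failure scenarios: (name, annual probability, consequence in $)
scenarios = [
    ("pressure vessel rupture", 1e-4, 5_000_000),
    ("seal leak", 5e-2, 20_000),
    ("sensor fault", 1e-1, 5_000),
]

print("Ranked by expected loss (unbiased probability estimates):")
for name, p, cost in sorted(scenarios, key=lambda s: expected_loss(s[1], s[2]), reverse=True):
    print(f"  {name}: {expected_loss(p, cost):,.0f}")

print("Ranked by expected loss (overestimated probabilities):")
for name, p, cost in sorted(scenarios, key=lambda s: expected_loss(overweighted(s[1]), s[2]), reverse=True):
    print(f"  {name}: {expected_loss(overweighted(p), cost):,.0f}")
```

Under the unbiased estimates the frequent, low-cost events dominate the ranking, while the overweighted estimates push the rare, high-consequence event to the top; this illustrates, in simplified form, why systematic probability overestimation can shift where limited mitigation resources are directed.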