The Influence of Using Novel Predictive Technologies on Judgments of Stigma, Empathy, and Compassion among Healthcare Professionals

Daniel Z Buchman, Daphne Imahori, Christopher Lo, Katrina Hui, Caroline Walker, James Shaw, Karen D Davis

AJOB Neuroscience, 2024, pp. 32-45. doi:10.1080/21507740.2023.2225470 (Epub 2023-07-14)
Abstract
Background: Our objective was to evaluate whether the description of a machine learning (ML) app or brain imaging technology to predict the onset of schizophrenia or alcohol use disorder (AUD) influences healthcare professionals' judgments of stigma, empathy, and compassion.
Methods: We randomized healthcare professionals (N = 310) to one vignette about a person whose clinician seeks to predict schizophrenia or an AUD using an ML app, brain imaging, or a psychosocial assessment. Participants used scales to measure their judgments of stigma, empathy, and compassion.
Results: Participants randomized to the ML vignette endorsed less anger and more fear relative to the psychosocial vignette, and the brain imaging vignette elicited higher pity ratings. The brain imaging and ML vignettes evoked lower personal responsibility judgments compared to the psychosocial vignette. Physicians and nurses reported less empathy than clinical psychologists.
Conclusions: The use of predictive technologies may reinforce essentialist views about mental health and substance use that may increase specific aspects of stigma and reduce others.