The Algorithmic Divide: A Systematic Review on AI-Driven Racial Disparities in Healthcare.

IF 3.2 · CAS Tier 3 (Medicine) · JCR Q2 (PUBLIC, ENVIRONMENTAL & OCCUPATIONAL HEALTH) · Journal of Racial and Ethnic Health Disparities · Pub Date: 2024-12-18 · DOI: 10.1007/s40615-024-02237-0
Syed Ali Haider, Sahar Borna, Cesar A Gomez-Cabello, Sophia M Pressman, Clifton R Haider, Antonio Jorge Forte
{"title":"算法鸿沟:对医疗保健中人工智能驱动的种族差异的系统回顾。","authors":"Syed Ali Haider, Sahar Borna, Cesar A Gomez-Cabello, Sophia M Pressman, Clifton R Haider, Antonio Jorge Forte","doi":"10.1007/s40615-024-02237-0","DOIUrl":null,"url":null,"abstract":"<p><strong>Introduction: </strong>As artificial intelligence (AI) continues to permeate various sectors, concerns about disparities arising from its deployment have surfaced. AI's effectiveness correlates not only with the algorithm's quality but also with its training data's integrity. This systematic review investigates the racial disparities perpetuated by AI systems across diverse medical domains and the implications of deploying them, particularly in healthcare.</p><p><strong>Methods: </strong>Six electronic databases (PubMed, Scopus, IEEE, Google Scholar, EMBASE, and Cochrane) were systematically searched on October 3, 2023. Inclusion criteria were peer-reviewed articles in English from 2013 to 2023 that examined instances of racial bias perpetuated by AI in healthcare. Studies conducted outside of healthcare settings or that addressed biases other than racial, as well as letters, opinions were excluded. The risk of bias was identified using CASP criteria for reviews and the Modified Newcastle Scale for observational studies.</p><p><strong>Results: </strong>Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, 1272 articles were initially identified, from which 26 met eligibility criteria. Four articles were identified via snowballing, resulting in 30 articles in the analysis. Studies indicate a significant association between AI utilization and the exacerbation of racial disparities, especially in minority populations, including Blacks and Hispanics. Biased data, algorithm design, unfair deployment of algorithms, and historic/systemic inequities were identified as the causes. Study limitations stem from heterogeneity impeding broad comparisons and the preclusion of meta-analysis.</p><p><strong>Conclusion: </strong>To address racial disparities in healthcare outcomes, enhanced ethical considerations and regulatory frameworks are needed in AI healthcare applications. Comprehensive bias detection tools and mitigation strategies, coupled with active supervision by physicians, are essential to ensure AI becomes a tool for reducing racial disparities in healthcare outcomes.</p>","PeriodicalId":16921,"journal":{"name":"Journal of Racial and Ethnic Health Disparities","volume":" ","pages":""},"PeriodicalIF":3.2000,"publicationDate":"2024-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Algorithmic Divide: A Systematic Review on AI-Driven Racial Disparities in Healthcare.\",\"authors\":\"Syed Ali Haider, Sahar Borna, Cesar A Gomez-Cabello, Sophia M Pressman, Clifton R Haider, Antonio Jorge Forte\",\"doi\":\"10.1007/s40615-024-02237-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Introduction: </strong>As artificial intelligence (AI) continues to permeate various sectors, concerns about disparities arising from its deployment have surfaced. AI's effectiveness correlates not only with the algorithm's quality but also with its training data's integrity. 
This systematic review investigates the racial disparities perpetuated by AI systems across diverse medical domains and the implications of deploying them, particularly in healthcare.</p><p><strong>Methods: </strong>Six electronic databases (PubMed, Scopus, IEEE, Google Scholar, EMBASE, and Cochrane) were systematically searched on October 3, 2023. Inclusion criteria were peer-reviewed articles in English from 2013 to 2023 that examined instances of racial bias perpetuated by AI in healthcare. Studies conducted outside of healthcare settings or that addressed biases other than racial, as well as letters, opinions were excluded. The risk of bias was identified using CASP criteria for reviews and the Modified Newcastle Scale for observational studies.</p><p><strong>Results: </strong>Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, 1272 articles were initially identified, from which 26 met eligibility criteria. Four articles were identified via snowballing, resulting in 30 articles in the analysis. Studies indicate a significant association between AI utilization and the exacerbation of racial disparities, especially in minority populations, including Blacks and Hispanics. Biased data, algorithm design, unfair deployment of algorithms, and historic/systemic inequities were identified as the causes. Study limitations stem from heterogeneity impeding broad comparisons and the preclusion of meta-analysis.</p><p><strong>Conclusion: </strong>To address racial disparities in healthcare outcomes, enhanced ethical considerations and regulatory frameworks are needed in AI healthcare applications. Comprehensive bias detection tools and mitigation strategies, coupled with active supervision by physicians, are essential to ensure AI becomes a tool for reducing racial disparities in healthcare outcomes.</p>\",\"PeriodicalId\":16921,\"journal\":{\"name\":\"Journal of Racial and Ethnic Health Disparities\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":3.2000,\"publicationDate\":\"2024-12-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Racial and Ethnic Health Disparities\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1007/s40615-024-02237-0\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"PUBLIC, ENVIRONMENTAL & OCCUPATIONAL HEALTH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Racial and Ethnic Health Disparities","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1007/s40615-024-02237-0","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PUBLIC, ENVIRONMENTAL & OCCUPATIONAL HEALTH","Score":null,"Total":0}
Citations: 0

Abstract


Introduction: As artificial intelligence (AI) continues to permeate various sectors, concerns about disparities arising from its deployment have surfaced. AI's effectiveness correlates not only with the algorithm's quality but also with its training data's integrity. This systematic review investigates the racial disparities perpetuated by AI systems across diverse medical domains and the implications of deploying them, particularly in healthcare.

Methods: Six electronic databases (PubMed, Scopus, IEEE, Google Scholar, EMBASE, and Cochrane) were systematically searched on October 3, 2023. Inclusion criteria were peer-reviewed articles in English from 2013 to 2023 that examined instances of racial bias perpetuated by AI in healthcare. Studies conducted outside of healthcare settings or that addressed biases other than racial, as well as letters and opinion pieces, were excluded. The risk of bias was identified using CASP criteria for reviews and the Modified Newcastle Scale for observational studies.
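For illustration only (the authors' full search strategy is not reported in this abstract), a date-restricted PubMed query of this kind could be scripted with Biopython's Entrez module; the query terms, contact email, and result limit below are assumptions, not the published protocol.

```python
# Hypothetical sketch of a PubMed search along the lines described in Methods.
# Query string, email, and retmax are illustrative assumptions only.
from Bio import Entrez  # pip install biopython

Entrez.email = "you@example.org"  # NCBI requires a contact address

query = (
    '("artificial intelligence" OR "machine learning") '
    'AND ("racial bias" OR "racial disparities") AND healthcare'
)

handle = Entrez.esearch(
    db="pubmed",
    term=query,
    datetype="pdat",   # restrict by publication date
    mindate="2013",
    maxdate="2023",
    retmax=200,
)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records matched; first IDs: {record['IdList'][:5]}")
```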

Results: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, 1272 articles were initially identified, from which 26 met eligibility criteria. Four articles were identified via snowballing, resulting in 30 articles in the analysis. Studies indicate a significant association between AI utilization and the exacerbation of racial disparities, especially in minority populations, including Blacks and Hispanics. Biased data, algorithm design, unfair deployment of algorithms, and historic/systemic inequities were identified as the causes. Study limitations stem from heterogeneity impeding broad comparisons and the preclusion of meta-analysis.

Conclusion: To address racial disparities in healthcare outcomes, enhanced ethical considerations and regulatory frameworks are needed in AI healthcare applications. Comprehensive bias detection tools and mitigation strategies, coupled with active supervision by physicians, are essential to ensure AI becomes a tool for reducing racial disparities in healthcare outcomes.
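As a minimal sketch of what the "comprehensive bias detection tools" called for above might involve, one routine audit step is comparing a model's error rates across racial groups before deployment; the column names, group labels, and toy data below are illustrative assumptions, not material from the review.

```python
# Minimal sketch of a subgroup error-rate audit for a clinical prediction model.
# Data, column names, and group labels are illustrative assumptions only.
import pandas as pd

# y_true: observed outcome, y_pred: model prediction, group: self-reported race
df = pd.DataFrame({
    "y_true": [1, 0, 1, 1, 0, 1, 0, 1],
    "y_pred": [1, 0, 0, 1, 0, 0, 0, 1],
    "group":  ["A", "A", "A", "A", "B", "B", "B", "B"],
})

def false_negative_rate(g: pd.DataFrame) -> float:
    """Share of true positives the model misses within one group."""
    positives = g[g["y_true"] == 1]
    if positives.empty:
        return float("nan")
    return float((positives["y_pred"] == 0).mean())

fnr_by_group = {name: false_negative_rate(g) for name, g in df.groupby("group")}
print(fnr_by_group)
# A large gap in false negative rates between groups would flag the model
# for further review before it is used to guide care.
```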

Source Journal
Journal of Racial and Ethnic Health Disparities (PUBLIC, ENVIRONMENTAL & OCCUPATIONAL HEALTH)
CiteScore: 7.30
Self-citation rate: 5.10%
Articles per year: 263
Journal description: Journal of Racial and Ethnic Health Disparities reports on the scholarly progress of work to understand, address, and ultimately eliminate health disparities based on race and ethnicity. Efforts to explore underlying causes of health disparities and to describe interventions that have been undertaken to address racial and ethnic health disparities are featured. Promising studies that are ongoing or studies that have longer term data are welcome, as are studies that serve as lessons for best practices in eliminating health disparities. Original research, systematic reviews, and commentaries presenting the state-of-the-art thinking on problems centered on health disparities will be considered for publication. We particularly encourage review articles that generate innovative and testable ideas, and constructive discussions and/or critiques of health disparities. Because the Journal of Racial and Ethnic Health Disparities receives a large number of submissions, about 30% of submissions to the Journal are sent out for full peer review.