Gender and Ethnicity Bias of Text-to-Image Generative Artificial Intelligence in Medical Imaging, Part 2: Analysis of DALL-E 3

Journal of Nuclear Medicine Technology (IF 1.0, Q4, Radiology, Nuclear Medicine & Medical Imaging). Published: 2024-10-22. DOI: 10.2967/jnmt.124.268359
Geoffrey Currie, Johnathan Hewis, Elizabeth Hawk, Eric Rohren

Abstract


Disparity in gender and ethnic representation remains an issue across medicine and health science. Only 26%-35% of trainee radiologists are female, even though more than 50% of medical students are. Similar gender disparities are evident across the medical imaging professions. Generative artificial intelligence text-to-image production could reinforce or amplify these biases.

Methods: In March 2024, DALL-E 3 was used via GPT-4 to generate a series of individual and group images of medical imaging professionals: radiologist, nuclear medicine physician, radiographer, nuclear medicine technologist, medical physicist, radiopharmacist, and medical imaging nurse. Multiple iterations of images were generated using a variety of prompts. Collectively, 120 images were produced, depicting 524 characters for evaluation. All images were independently analyzed for apparent gender and skin tone by 3 expert reviewers from the medical imaging professions.

Results: Collectively (individual and group images), 57.4% (n = 301) of medical imaging professionals were depicted as male, 42.4% (n = 222) as female, and 91.2% (n = 478) as having a light skin tone. Male representation was 65% for radiologists, 62% for nuclear medicine physicians, 52% for radiographers, 56% for nuclear medicine technologists, 62% for medical physicists, 53% for radiopharmacists, and 26% for medical imaging nurses. For all professions, this overrepresents men relative to the actual workforce composition. There was no representation of persons with a disability.

Conclusion: This evaluation reveals a significant overrepresentation of the male gender in generative artificial intelligence text-to-image production using DALL-E 3 across the medical imaging professions. Generated images have a disproportionately high representation of white men, which is not representative of the diversity of the medical imaging professions.
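The aggregate percentages above follow directly from the reviewer tallies over the 524 evaluated characters. A minimal sketch of that arithmetic (the counts are the paper's reported totals; the variable names and `percentage` helper are illustrative assumptions, not the authors' code):

```python
# Reproduce the reported aggregate percentages from the raw character counts.
# Counts come from the abstract; this tallying code is a hypothetical sketch.

def percentage(count: int, total: int) -> float:
    """Share of characters in a category, rounded to one decimal place."""
    return round(100 * count / total, 1)

total_characters = 524   # characters evaluated across 120 generated images
male = 301
female = 222
light_skin_tone = 478

summary = {
    "male_pct": percentage(male, total_characters),            # 57.4
    "female_pct": percentage(female, total_characters),        # 42.4
    "light_skin_pct": percentage(light_skin_tone, total_characters),  # 91.2
}
print(summary)
```

Note that male (301) and female (222) counts sum to 523, not 524, so one character was presumably not classifiable by apparent gender.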
