Can generative artificial intelligence facilitate illustration of- and communication regarding hallucinations and delusions?

Acta Psychiatrica Scandinavica · Pub Date: 2024-03-15 · DOI: 10.1111/acps.13680
Søren Dinesen Østergaard
{"title":"Can generative artificial intelligence facilitate illustration of- and communication regarding hallucinations and delusions?","authors":"Søren Dinesen Østergaard","doi":"10.1111/acps.13680","DOIUrl":null,"url":null,"abstract":"<p>In the context of artificial intelligence (AI), the term “hallucinations” is used to refer to false responses generated by conversational agents/chatbots or other generative AI tools (artificial intelligence/machine learning models capable of generating content such as text, speech, images and video).<span><sup>1, 2</sup></span> This metaphor is unfortunate as it is both imprecise and, due to its clear negative connotation, stigmatizing for the many individuals experiencing hallucinations—those with schizophrenia and other psychotic disorders in particular.<span><sup>3</sup></span> While the stigma associated with the misunderstood use of the term hallucinations in relation to generative AI is highly unfortunate, this editorial will propose that it is, however, possible that generative AI may also be helpful and reduce stigma for those experiencing hallucinations and/or delusions.</p><p>For individuals with schizophrenia and other psychotic disorders it can be difficult to communicate the nature and quality of their hallucinations and delusions to relatives, friends as well as to the healthcare professionals involved in their treatment—often due to the recipients not handling this communication well enough.<span><sup>4</sup></span> This may lead to lack of understanding of the suffering associated with these symptoms and could, in turn, contribute to detachment from loved ones, stigma and suboptimal treatment.<span><sup>5</sup></span> Therefore, tools to facilitate communication regarding hallucinations and delusions are sorely needed.</p><p>This editorial will propose that AI tools capable of generating images (e.g., DALL·E<span><sup>6</sup></span>) and video (e.g., Sora<span><sup>7</sup></span>) may be used to facilitate (highly affordable) illustration of- and, thereby, communication regarding hallucinations and delusions experienced by people with schizophrenia and other psychotic disorders. Interestingly, this approach has recently been described in an ophthalmological case report, where Woods and colleagues report on the diagnosing and treatment of a patient with monocular Charles Bonnet syndrome secondary to optic neuritis, where generative AI was used to successfully illustrate the patient's visual hallucination.<span><sup>8</sup></span></p><p>Figure 1 shows three hypothetical examples of hallucinations and delusions illustrated using the version of DALL·E<span><sup>6</sup></span> embedded within ChatGPT-4<span><sup>9</sup></span> at the time of writing—along with the exact prompts that were used to generate the images. Two images of each set of symptoms are shown (read from left to right)—highlighting the importance of the wording of the prompts and the possibility to revise images if they are not accurately portraying the symptoms in question.</p><p>While this is, by no means, a formal assessment of the quality of the generated illustrations or their usefulness, it does seem that this approach has the potential to aid communication regarding hallucinations and delusions between individuals experiencing psychotic symptoms and the people they interact with both privately and in the context of their treatment. 
Whether this is indeed the case should, of course, be subjected to formal study, as is currently being done for the application of virtual reality interventions to reduce auditory hallucinations—an approach somewhat related to that proposed here—as it involves illustration/creation of an avatar with a voice resembling the hallucinatory voice experienced by the patient.<span><sup>10, 11</sup></span> The hope would be that the results of such studies would document that the facilitated communication regarding hallucinations and delusions due to the AI-generated illustrations would both reduce stigma (showing the illustrations to other people will increase understanding of/demystify psychosis) and improve patient-centered treatment. Indeed, with regard to the latter, it seems likely that the illustrations could be used to provide an exhaustive “map” of a patient's psychotic symptoms in the beginning of the course of treatment, a map which can then be revised as needed over time—and provide the basis for psychometric rating (“how severe has the symptom illustrated by this image been over the past week?”)—thereby facilitating measurement-based care.<span><sup>12-14</sup></span></p><p>With regard to the practical process of generating illustrations of the specific hallucinations/delusions experienced by individuals with schizophrenia and other psychotic disorders, it would seem advisable to have this aided by a healthcare professional with some experience in prompting, for at least three reasons. First, generating images matching the psychotic symptoms described by a patient will require some practice in prompting (“prompt engineering” is becoming a discipline in itself).<span><sup>15</sup></span> Second, it cannot be excluded that some patients may experience worsening of, for example, paranoid thoughts if an image is either too close to his/her symptoms (“how can the AI possibly portray my inner world so well? There must indeed be cameras installed in my apartment!”) or even evoke a new delusion if an image is off target (“I hadn't thought of the possibility that there could also be a spy camera installed in my own camera… I have to destroy it immediately.“).<span><sup>16</sup></span> Therefore, talking the experience of generating the illustrations through with a healthcare professional who, if needed, can explain the basics of how the AI technology works (trained on a tremendous amount of text and images) and resolve any misunderstandings/misinterpretations, will likely be ideal and may even create an empowering sense of co-creation for the patient. Third, there is also a privacy issue to take into account. Specifically, during prompting, personal information potentially linking the individual patient to the portrayed hallucinations and delusions should not be disclosed. The healthcare professional will have the required experience in handling personal information to prevent this from happening.</p><p>In summary, this editorial proposes that image generation using AI may provide highly affordable, yet very valuable, illustrations of the hallucinations and delusions experienced by individuals with schizophrenia and other psychotic disorders. These illustrations may ease the complex communication regarding these symptoms and, thereby, have the potential to reduce stigma and improve the treatment of psychosis.</p><p>There was no funding for this work. 
Outside this work, SDØ is supported by the Novo Nordisk Foundation (grant number: NNF20SA0062874), the Lundbeck Foundation (grant numbers: R358-2020-2341 and R344-2020-1073), the Danish Cancer Society (grant number: R283-A16461), the Central Denmark Region Fund for Strengthening of Health Science (grant number: 1-36-72-4-20), The Danish Agency for Digitisation Investment Fund for New Technologies (grant number 2020-6720), and Independent Research Fund Denmark (grant number: 7016-00048B and 2096-00055A). These funders had no role in this work, nor in the preparation, review, or approval of the manuscript or the decision to submit for publication.</p><p>SDØ received the 2020 Lundbeck Foundation Young Investigator Prize. SDØ owns/has owned units of mutual funds with stock tickers DKIGI, IAIMWC, SPIC25KL and WEKAFKI, and owns/has owned units of exchange traded funds with stock tickers BATE, TRET, QDV5, QDVH, QDVE, SADM, IQQH, USPY, EXH2, 2B76, IS4S, OM3X and EUNL.</p>","PeriodicalId":108,"journal":{"name":"Acta Psychiatrica Scandinavica","volume":null,"pages":null},"PeriodicalIF":5.3000,"publicationDate":"2024-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/acps.13680","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Acta Psychiatrica Scandinavica","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/acps.13680","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHIATRY","Score":null,"Total":0}
引用次数: 0

Abstract

In the context of artificial intelligence (AI), the term “hallucinations” is used to refer to false responses generated by conversational agents/chatbots or other generative AI tools (artificial intelligence/machine learning models capable of generating content such as text, speech, images and video) [1, 2]. This metaphor is unfortunate as it is both imprecise and, due to its clear negative connotation, stigmatizing for the many individuals experiencing hallucinations—those with schizophrenia and other psychotic disorders in particular [3]. While the stigma associated with the misunderstood use of the term hallucinations in relation to generative AI is highly unfortunate, this editorial proposes that generative AI may nevertheless also be helpful and reduce stigma for those experiencing hallucinations and/or delusions.

For individuals with schizophrenia and other psychotic disorders, it can be difficult to communicate the nature and quality of their hallucinations and delusions to relatives and friends, as well as to the healthcare professionals involved in their treatment—often because the recipients do not handle this communication well enough [4]. This may lead to a lack of understanding of the suffering associated with these symptoms and could, in turn, contribute to detachment from loved ones, stigma and suboptimal treatment [5]. Therefore, tools to facilitate communication regarding hallucinations and delusions are sorely needed.

This editorial proposes that AI tools capable of generating images (e.g., DALL·E [6]) and video (e.g., Sora [7]) may be used to facilitate (highly affordable) illustration of, and thereby communication regarding, the hallucinations and delusions experienced by people with schizophrenia and other psychotic disorders. Interestingly, this approach has recently been described in an ophthalmological case report in which Woods and colleagues describe the diagnosis and treatment of a patient with monocular Charles Bonnet syndrome secondary to optic neuritis, and in which generative AI was used to successfully illustrate the patient's visual hallucination [8].

Figure 1 shows three hypothetical examples of hallucinations and delusions illustrated using the version of DALL·E [6] embedded within ChatGPT-4 [9] at the time of writing, along with the exact prompts that were used to generate the images. Two images are shown for each set of symptoms (read from left to right), highlighting the importance of the wording of the prompts and the possibility of revising images if they do not accurately portray the symptoms in question.
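
To make the practical workflow concrete, the following is a minimal sketch (not taken from the editorial) of how such images could be generated and then revised programmatically, assuming the OpenAI Python SDK (version 1.x) with the DALL·E 3 model exposed through the Images API; the prompt wording is purely hypothetical.

```python
# Hedged sketch: generate an illustration from a clinician-written prompt,
# then regenerate with a revised prompt if the first image misses the mark.
# Assumes the OpenAI Python SDK (>= 1.0) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def illustrate(prompt: str) -> str:
    """Generate one image for a symptom description and return its URL."""
    response = client.images.generate(
        model="dall-e-3",   # DALL·E 3 via the Images API; n=1 is required for this model
        prompt=prompt,
        size="1024x1024",
        n=1,
    )
    return response.data[0].url

# Hypothetical first attempt, followed by a revised prompt.
first = illustrate(
    "A dim hospital corridor where shadowy figures whisper from the walls, "
    "seen from the patient's point of view"
)
revised = illustrate(
    "The same dim hospital corridor, but the whispering figures are faint, "
    "translucent outlines rather than solid people"
)
print(first, revised, sep="\n")
```

The same iterate-and-revise loop is what Figure 1 illustrates using the ChatGPT interface, where no code is needed.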

While this is by no means a formal assessment of the quality of the generated illustrations or their usefulness, it does seem that this approach has the potential to aid communication regarding hallucinations and delusions between individuals experiencing psychotic symptoms and the people they interact with, both privately and in the context of their treatment. Whether this is indeed the case should, of course, be subjected to formal study, as is currently being done for virtual reality interventions aimed at reducing auditory hallucinations—an approach somewhat related to the one proposed here, as it involves the creation of an avatar with a voice resembling the hallucinatory voice experienced by the patient [10, 11]. The hope is that such studies would show that the communication facilitated by the AI-generated illustrations both reduces stigma (showing the illustrations to other people should increase understanding of, and demystify, psychosis) and improves patient-centered treatment. Indeed, with regard to the latter, it seems likely that the illustrations could be used to provide an exhaustive “map” of a patient's psychotic symptoms at the beginning of the course of treatment—a map that can then be revised as needed over time and that can provide the basis for psychometric rating (“How severe has the symptom illustrated by this image been over the past week?”), thereby facilitating measurement-based care [12-14].
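
As one way of picturing how such a revisable symptom “map” and the associated weekly ratings might be organized, here is a hedged sketch in Python; the data structure and field names are hypothetical and are not part of the editorial or of any existing measurement-based care system.

```python
# Illustrative sketch only: link each accepted AI-generated illustration to the
# patient's own description of the symptom and to repeated severity ratings.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class IllustratedSymptom:
    symptom_id: str                                   # e.g. "voices-1" (hypothetical)
    description: str                                  # patient's own wording of the symptom
    image_path: str                                   # file path of the accepted illustration
    ratings: dict[date, int] = field(default_factory=dict)  # 0-10 severity per rating week

    def rate(self, week_start: date, severity: int) -> None:
        """Record: 'How severe has the symptom in this image been over the past week?'"""
        if not 0 <= severity <= 10:
            raise ValueError("severity must be between 0 and 10")
        self.ratings[week_start] = severity

# A symptom map can be revised over time by adding, editing, or retiring entries.
symptom_map = [
    IllustratedSymptom("voices-1", "Two voices commenting on my actions", "img/voices_v2.png"),
]
symptom_map[0].rate(date(2024, 3, 11), 6)
```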

With regard to the practical process of generating illustrations of the specific hallucinations/delusions experienced by individuals with schizophrenia and other psychotic disorders, it would seem advisable to have this aided by a healthcare professional with some experience in prompting, for at least three reasons. First, generating images matching the psychotic symptoms described by a patient will require some practice in prompting (“prompt engineering” is becoming a discipline in itself) [15]. Second, it cannot be excluded that some patients may experience worsening of, for example, paranoid thoughts if an image is too close to their symptoms (“How can the AI possibly portray my inner world so well? There must indeed be cameras installed in my apartment!”), or that an image that is off target may even evoke a new delusion (“I hadn't thought of the possibility that there could also be a spy camera installed in my own camera… I have to destroy it immediately.”) [16]. Therefore, talking through the experience of generating the illustrations with a healthcare professional who, if needed, can explain the basics of how the AI technology works (it is trained on a tremendous amount of text and images) and resolve any misunderstandings/misinterpretations will likely be ideal and may even create an empowering sense of co-creation for the patient. Third, there is also a privacy issue to take into account. Specifically, during prompting, personal information potentially linking the individual patient to the portrayed hallucinations and delusions should not be disclosed. The healthcare professional will have the required experience in handling personal information to prevent this from happening.
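
As a sketch of the third point, a crude automated check could flag obvious personal identifiers in a prompt before it is submitted to an image-generation service; the patterns below are illustrative assumptions only and are no substitute for review by the healthcare professional or a proper de-identification workflow.

```python
# Hedged sketch: flag obvious personal identifiers (email, phone, date of birth)
# in a prompt before sending it. Patterns are deliberately simple and illustrative.
import re

PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone number": re.compile(r"\b(?:\+?\d[\s-]?){8,}\b"),
    "date of birth": re.compile(r"\b\d{1,2}[./-]\d{1,2}[./-]\d{2,4}\b"),
}

def flag_identifiers(prompt: str) -> list[str]:
    """Return the kinds of personal identifiers found in the prompt, if any."""
    return [label for label, pattern in PATTERNS.items() if pattern.search(prompt)]

prompt = "Illustrate the shadow that follows me home from the clinic every evening."
issues = flag_identifiers(prompt)
if issues:
    print("Revise the prompt before submitting; it appears to contain:", ", ".join(issues))
else:
    print("No obvious identifiers found; review manually before submitting.")
```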

In summary, this editorial proposes that image generation using AI may provide highly affordable, yet very valuable, illustrations of the hallucinations and delusions experienced by individuals with schizophrenia and other psychotic disorders. These illustrations may ease the complex communication regarding these symptoms and, thereby, have the potential to reduce stigma and improve the treatment of psychosis.

There was no funding for this work. Outside this work, SDØ is supported by the Novo Nordisk Foundation (grant number: NNF20SA0062874), the Lundbeck Foundation (grant numbers: R358-2020-2341 and R344-2020-1073), the Danish Cancer Society (grant number: R283-A16461), the Central Denmark Region Fund for Strengthening of Health Science (grant number: 1-36-72-4-20), the Danish Agency for Digitisation Investment Fund for New Technologies (grant number: 2020-6720), and Independent Research Fund Denmark (grant numbers: 7016-00048B and 2096-00055A). These funders had no role in this work, nor in the preparation, review, or approval of the manuscript or the decision to submit for publication.

SDØ received the 2020 Lundbeck Foundation Young Investigator Prize. SDØ owns/has owned units of mutual funds with stock tickers DKIGI, IAIMWC, SPIC25KL and WEKAFKI, and owns/has owned units of exchange traded funds with stock tickers BATE, TRET, QDV5, QDVH, QDVE, SADM, IQQH, USPY, EXH2, 2B76, IS4S, OM3X and EUNL.
