{"title":"Can generative artificial intelligence facilitate illustration of- and communication regarding hallucinations and delusions?","authors":"Søren Dinesen Østergaard","doi":"10.1111/acps.13680","DOIUrl":null,"url":null,"abstract":"<p>In the context of artificial intelligence (AI), the term “hallucinations” is used to refer to false responses generated by conversational agents/chatbots or other generative AI tools (artificial intelligence/machine learning models capable of generating content such as text, speech, images and video).<span><sup>1, 2</sup></span> This metaphor is unfortunate as it is both imprecise and, due to its clear negative connotation, stigmatizing for the many individuals experiencing hallucinations—those with schizophrenia and other psychotic disorders in particular.<span><sup>3</sup></span> While the stigma associated with the misunderstood use of the term hallucinations in relation to generative AI is highly unfortunate, this editorial will propose that it is, however, possible that generative AI may also be helpful and reduce stigma for those experiencing hallucinations and/or delusions.</p><p>For individuals with schizophrenia and other psychotic disorders it can be difficult to communicate the nature and quality of their hallucinations and delusions to relatives, friends as well as to the healthcare professionals involved in their treatment—often due to the recipients not handling this communication well enough.<span><sup>4</sup></span> This may lead to lack of understanding of the suffering associated with these symptoms and could, in turn, contribute to detachment from loved ones, stigma and suboptimal treatment.<span><sup>5</sup></span> Therefore, tools to facilitate communication regarding hallucinations and delusions are sorely needed.</p><p>This editorial will propose that AI tools capable of generating images (e.g., DALL·E<span><sup>6</sup></span>) and video (e.g., Sora<span><sup>7</sup></span>) may be used to facilitate (highly affordable) illustration of- and, thereby, communication regarding hallucinations and delusions experienced by people with schizophrenia and other psychotic disorders. Interestingly, this approach has recently been described in an ophthalmological case report, where Woods and colleagues report on the diagnosing and treatment of a patient with monocular Charles Bonnet syndrome secondary to optic neuritis, where generative AI was used to successfully illustrate the patient's visual hallucination.<span><sup>8</sup></span></p><p>Figure 1 shows three hypothetical examples of hallucinations and delusions illustrated using the version of DALL·E<span><sup>6</sup></span> embedded within ChatGPT-4<span><sup>9</sup></span> at the time of writing—along with the exact prompts that were used to generate the images. Two images of each set of symptoms are shown (read from left to right)—highlighting the importance of the wording of the prompts and the possibility to revise images if they are not accurately portraying the symptoms in question.</p><p>While this is, by no means, a formal assessment of the quality of the generated illustrations or their usefulness, it does seem that this approach has the potential to aid communication regarding hallucinations and delusions between individuals experiencing psychotic symptoms and the people they interact with both privately and in the context of their treatment. 
Whether this is indeed the case should, of course, be subjected to formal study, as is currently being done for the application of virtual reality interventions to reduce auditory hallucinations—an approach somewhat related to that proposed here—as it involves illustration/creation of an avatar with a voice resembling the hallucinatory voice experienced by the patient.<span><sup>10, 11</sup></span> The hope would be that the results of such studies would document that the facilitated communication regarding hallucinations and delusions due to the AI-generated illustrations would both reduce stigma (showing the illustrations to other people will increase understanding of/demystify psychosis) and improve patient-centered treatment. Indeed, with regard to the latter, it seems likely that the illustrations could be used to provide an exhaustive “map” of a patient's psychotic symptoms in the beginning of the course of treatment, a map which can then be revised as needed over time—and provide the basis for psychometric rating (“how severe has the symptom illustrated by this image been over the past week?”)—thereby facilitating measurement-based care.<span><sup>12-14</sup></span></p><p>With regard to the practical process of generating illustrations of the specific hallucinations/delusions experienced by individuals with schizophrenia and other psychotic disorders, it would seem advisable to have this aided by a healthcare professional with some experience in prompting, for at least three reasons. First, generating images matching the psychotic symptoms described by a patient will require some practice in prompting (“prompt engineering” is becoming a discipline in itself).<span><sup>15</sup></span> Second, it cannot be excluded that some patients may experience worsening of, for example, paranoid thoughts if an image is either too close to his/her symptoms (“how can the AI possibly portray my inner world so well? There must indeed be cameras installed in my apartment!”) or even evoke a new delusion if an image is off target (“I hadn't thought of the possibility that there could also be a spy camera installed in my own camera… I have to destroy it immediately.“).<span><sup>16</sup></span> Therefore, talking the experience of generating the illustrations through with a healthcare professional who, if needed, can explain the basics of how the AI technology works (trained on a tremendous amount of text and images) and resolve any misunderstandings/misinterpretations, will likely be ideal and may even create an empowering sense of co-creation for the patient. Third, there is also a privacy issue to take into account. Specifically, during prompting, personal information potentially linking the individual patient to the portrayed hallucinations and delusions should not be disclosed. The healthcare professional will have the required experience in handling personal information to prevent this from happening.</p><p>In summary, this editorial proposes that image generation using AI may provide highly affordable, yet very valuable, illustrations of the hallucinations and delusions experienced by individuals with schizophrenia and other psychotic disorders. These illustrations may ease the complex communication regarding these symptoms and, thereby, have the potential to reduce stigma and improve the treatment of psychosis.</p><p>There was no funding for this work. 
Outside this work, SDØ is supported by the Novo Nordisk Foundation (grant number: NNF20SA0062874), the Lundbeck Foundation (grant numbers: R358-2020-2341 and R344-2020-1073), the Danish Cancer Society (grant number: R283-A16461), the Central Denmark Region Fund for Strengthening of Health Science (grant number: 1-36-72-4-20), The Danish Agency for Digitisation Investment Fund for New Technologies (grant number 2020-6720), and Independent Research Fund Denmark (grant number: 7016-00048B and 2096-00055A). These funders had no role in this work, nor in the preparation, review, or approval of the manuscript or the decision to submit for publication.</p><p>SDØ received the 2020 Lundbeck Foundation Young Investigator Prize. SDØ owns/has owned units of mutual funds with stock tickers DKIGI, IAIMWC, SPIC25KL and WEKAFKI, and owns/has owned units of exchange traded funds with stock tickers BATE, TRET, QDV5, QDVH, QDVE, SADM, IQQH, USPY, EXH2, 2B76, IS4S, OM3X and EUNL.</p>","PeriodicalId":108,"journal":{"name":"Acta Psychiatrica Scandinavica","volume":null,"pages":null},"PeriodicalIF":5.3000,"publicationDate":"2024-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/acps.13680","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Acta Psychiatrica Scandinavica","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/acps.13680","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHIATRY","Score":null,"Total":0}
Abstract
In the context of artificial intelligence (AI), the term “hallucinations” is used to refer to false responses generated by conversational agents/chatbots or other generative AI tools (artificial intelligence/machine learning models capable of generating content such as text, speech, images, and video) [1, 2]. This metaphor is unfortunate, as it is both imprecise and, due to its clear negative connotation, stigmatizing for the many individuals experiencing hallucinations—those with schizophrenia and other psychotic disorders in particular [3]. While the stigma associated with this misunderstood use of the term in relation to generative AI is highly unfortunate, this editorial proposes that generative AI may nevertheless also be helpful to, and reduce stigma for, those experiencing hallucinations and/or delusions.
For individuals with schizophrenia and other psychotic disorders, it can be difficult to communicate the nature and quality of their hallucinations and delusions to relatives, friends, and the healthcare professionals involved in their treatment—often because the recipients do not handle this communication well enough [4]. This may lead to a lack of understanding of the suffering associated with these symptoms and could, in turn, contribute to detachment from loved ones, stigma, and suboptimal treatment [5]. Therefore, tools to facilitate communication regarding hallucinations and delusions are sorely needed.
This editorial proposes that AI tools capable of generating images (e.g., DALL·E [6]) and video (e.g., Sora [7]) may be used to facilitate highly affordable illustration of, and thereby communication regarding, the hallucinations and delusions experienced by people with schizophrenia and other psychotic disorders. Interestingly, this approach has recently been described in an ophthalmological case report, in which Woods and colleagues describe the diagnosis and treatment of a patient with monocular Charles Bonnet syndrome secondary to optic neuritis, where generative AI was used to successfully illustrate the patient's visual hallucination [8].
Figure 1 shows three hypothetical examples of hallucinations and delusions illustrated using the version of DALL·E [6] embedded within ChatGPT-4 [9] at the time of writing, along with the exact prompts that were used to generate the images. Two images are shown for each set of symptoms (read from left to right), highlighting the importance of the wording of the prompts and the possibility of revising images that do not accurately portray the symptoms in question.
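For readers wishing to experiment, the generate-inspect-revise workflow described above can also be scripted rather than run through the ChatGPT interface. Below is a minimal, purely illustrative Python sketch assuming the OpenAI Python SDK (openai>=1.0) and an API key in the environment; the prompt text is an invented example, not one of the prompts used for Figure 1.

```python
# Minimal sketch of generating a symptom illustration with the OpenAI
# Python SDK (openai>=1.0). The prompt below is an invented example of a
# clinician-worded, de-identified symptom description.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "A photorealistic living room at dusk where shadowy, faceless "
    "figures stand silently in the corners, conveying a sense of "
    "being watched."
)

response = client.images.generate(
    model="dall-e-3",   # the DALL·E model exposed through the API
    prompt=prompt,
    size="1024x1024",
    n=1,
)
print(response.data[0].url)  # URL of the generated illustration

# If the image misses the mark, the prompt is reworded and the call is
# repeated, mirroring the left-to-right revision shown in Figure 1.
```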
While this is by no means a formal assessment of the quality or usefulness of the generated illustrations, the approach does seem to have the potential to aid communication regarding hallucinations and delusions between individuals experiencing psychotic symptoms and the people they interact with, both privately and in the context of their treatment. Whether this is indeed the case should, of course, be subjected to formal study, as is currently being done for virtual reality interventions to reduce auditory hallucinations [10, 11]—an approach somewhat related to the one proposed here, as it involves the creation of an avatar with a voice resembling the hallucinatory voice experienced by the patient. The hope would be that such studies document that the communication facilitated by the AI-generated illustrations both reduces stigma (showing the illustrations to other people should increase understanding of, and demystify, psychosis) and improves patient-centered treatment. Indeed, with regard to the latter, the illustrations could be used to provide an exhaustive “map” of a patient's psychotic symptoms at the beginning of the course of treatment, a map which can then be revised as needed over time, and to provide the basis for psychometric rating (“how severe has the symptom illustrated by this image been over the past week?”), thereby facilitating measurement-based care [12-14].
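To make the idea of a revisable symptom “map” with attached weekly ratings concrete, here is a hypothetical Python sketch; all names and the 0-10 scale are assumptions chosen for illustration, not a validated psychometric instrument.

```python
# Hypothetical sketch of a symptom "map": each AI-generated illustration
# is stored alongside weekly severity ratings, supporting the kind of
# measurement-based care described above. All names are illustrative.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SymptomIllustration:
    image_url: str    # link to the AI-generated illustration
    description: str  # the de-identified prompt used to create it
    ratings: dict[date, int] = field(default_factory=dict)  # week -> 0-10

    def rate(self, week_start: date, severity: int) -> None:
        """Record 'how severe has this symptom been over the past week?'"""
        if not 0 <= severity <= 10:
            raise ValueError("severity must be on the 0-10 scale")
        self.ratings[week_start] = severity

# A patient's map is simply a revisable collection of such entries,
# which can be extended or pruned as symptoms change over time.
symptom_map: list[SymptomIllustration] = []
```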
With regard to the practical process of generating illustrations of the specific hallucinations/delusions experienced by individuals with schizophrenia and other psychotic disorders, it would seem advisable to have this aided by a healthcare professional with some experience in prompting, for at least three reasons. First, generating images matching the psychotic symptoms described by a patient will require some practice in prompting (“prompt engineering” is becoming a discipline in itself) [15]. Second, it cannot be excluded that some patients may experience worsening of, for example, paranoid thoughts if an image is too close to their symptoms (“how can the AI possibly portray my inner world so well? There must indeed be cameras installed in my apartment!”), or may even develop a new delusion if an image is off target (“I hadn't thought of the possibility that there could also be a spy camera installed in my own camera… I have to destroy it immediately.”) [16]. Therefore, talking the experience of generating the illustrations through with a healthcare professional, who can, if needed, explain the basics of how the AI technology works (it is trained on a tremendous amount of text and images) and resolve any misunderstandings or misinterpretations, will likely be ideal and may even create an empowering sense of co-creation for the patient. Third, there is also a privacy issue to take into account. Specifically, during prompting, personal information potentially linking the individual patient to the portrayed hallucinations and delusions should not be disclosed. The healthcare professional will have the experience in handling personal information required to prevent this from happening.
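As a purely hypothetical complement to the clinician's review, a simple automated check could flag obvious identifiers in a prompt before it is submitted to the image model. The Python sketch below is an assumption-laden illustration: the patterns are naive examples and would not suffice on their own as a privacy safeguard.

```python
# Illustrative pre-submission privacy check: the prompt is scanned for
# obvious personal identifiers before being sent to the image model.
# These patterns are examples only and are no substitute for a clinician
# reviewing the prompt, as argued above.
import re

# Hypothetical patterns for ID numbers, names, and street addresses.
IDENTIFIER_PATTERNS = [
    re.compile(r"\b\d{8,}\b"),                    # long digit runs
    re.compile(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b"),   # naive "First Last"
    re.compile(r"\d+\s+\w+\s+(Street|Road|Vej)\b", re.IGNORECASE),
]

def flag_identifiers(prompt: str) -> list[str]:
    """Return any substrings that look like personal information."""
    return [m.group(0)
            for pattern in IDENTIFIER_PATTERNS
            for m in pattern.finditer(prompt)]

prompt = "A dark hallway where whispering voices follow the viewer."
if flag_identifiers(prompt):
    raise ValueError("prompt may contain personal information; revise it")
```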
In summary, this editorial proposes that image generation using AI may provide highly affordable, yet very valuable, illustrations of the hallucinations and delusions experienced by individuals with schizophrenia and other psychotic disorders. These illustrations may ease the complex communication regarding these symptoms and, thereby, have the potential to reduce stigma and improve the treatment of psychosis.
There was no funding for this work. Outside this work, SDØ is supported by the Novo Nordisk Foundation (grant number: NNF20SA0062874), the Lundbeck Foundation (grant numbers: R358-2020-2341 and R344-2020-1073), the Danish Cancer Society (grant number: R283-A16461), the Central Denmark Region Fund for Strengthening of Health Science (grant number: 1-36-72-4-20), the Danish Agency for Digitisation Investment Fund for New Technologies (grant number: 2020-6720), and the Independent Research Fund Denmark (grant numbers: 7016-00048B and 2096-00055A). These funders had no role in this work, nor in the preparation, review, or approval of the manuscript or the decision to submit for publication.
SDØ received the 2020 Lundbeck Foundation Young Investigator Prize. SDØ owns/has owned units of mutual funds with stock tickers DKIGI, IAIMWC, SPIC25KL and WEKAFKI, and owns/has owned units of exchange traded funds with stock tickers BATE, TRET, QDV5, QDVH, QDVE, SADM, IQQH, USPY, EXH2, 2B76, IS4S, OM3X and EUNL.