David Martens, James Hinns, Camille Dams, Mark Vergouwen, Theodoros Evgeniou
Decision Support Systems, Volume 191, Article 114402. Published 2025-01-31. DOI: 10.1016/j.dss.2025.114402
https://www.sciencedirect.com/science/article/pii/S016792362500003X
Tell me a story! Narrative-driven XAI with Large Language Models
Existing Explainable AI (XAI) approaches, such as the widely used SHAP values or counterfactual (CF) explanations, are arguably often too technical for users to understand and act upon. To enhance comprehension of explanations of AI decisions and the overall user experience, we introduce XAIstories, which leverage Large Language Models (LLMs) to provide narratives about how AI predictions are made: SHAPstories, based on SHAP explanations, and CFstories, based on CF explanations. We study the impact of our approach on users’ experience and understanding of AI predictions. Our results are striking: in a tabular data experiment, over 90% of the surveyed general audience finds the narratives generated by SHAPstories convincing, and over 78% finds those generated by CFstories convincing. More than 75% of the respondents in an image experiment find CFstories as convincing as, or more convincing than, stories they craft themselves. We also find that the generated stories help users summarize and understand AI decisions more accurately than SHAP values alone do. The results indicate that combining LLM-generated stories with current XAI methods is a promising and impactful research direction.
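The abstract describes the pipeline only at a high level. As one illustration of the SHAPstories idea (a minimal sketch, not the authors' implementation), one can compute per-feature SHAP attributions for a single prediction, embed the top contributors in a prompt, and ask an LLM to narrate them. The dataset, model, prompt wording, and five-feature cutoff below are assumptions made for illustration only.

```python
# Sketch of a SHAP-to-narrative pipeline, loosely following the SHAPstories
# idea from the abstract. All concrete choices here are illustrative.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train any model; a random forest on the diabetes dataset stands in here.
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Compute SHAP attributions for a single instance to be explained.
explainer = shap.TreeExplainer(model)
instance = X.iloc[[0]]
shap_values = explainer.shap_values(instance)[0]  # per-feature contributions

# Rank features by absolute contribution and render the top ones as text.
contributions = sorted(
    zip(X.columns, instance.iloc[0], shap_values),
    key=lambda t: abs(t[2]),
    reverse=True,
)
feature_lines = "\n".join(
    f"- {name} = {value:.3f} (SHAP contribution: {impact:+.3f})"
    for name, value, impact in contributions[:5]
)

# Assemble a prompt asking the LLM for a plain-language story.
# This wording is an assumption, not the paper's actual prompt.
prompt = (
    "A machine learning model predicted a disease-progression score of "
    f"{model.predict(instance)[0]:.1f} for a patient. The most influential "
    "features and their SHAP contributions were:\n"
    f"{feature_lines}\n"
    "Write a short, plain-language story explaining this prediction to a "
    "non-technical reader, without mentioning SHAP values directly."
)

print(prompt)
# story = call_llm(prompt)  # route the prompt to any chat-LLM API of your choice
```

A CFstories variant would follow the same pattern, substituting the SHAP attributions with a counterfactual explanation (the minimal feature changes that would flip the prediction) in the prompt.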
Journal introduction:
The common thread of articles published in Decision Support Systems is their relevance to theoretical and technical issues in the support of enhanced decision making. The areas addressed may include foundations, functionality, interfaces, implementation, impacts, and evaluation of decision support systems (DSSs).