Emanuele Borgonovo, Elmar Plischke, Giovanni Rabitti
European Journal of Operational Research (Q1, Operations Research & Management Science)
Published online: 2024-06-22 | DOI: 10.1016/j.ejor.2024.06.023
Full text: https://www.sciencedirect.com/science/article/pii/S0377221724004715
The many Shapley values for explainable artificial intelligence: A sensitivity analysis perspective
Predictive models are increasingly used for managerial and operational decision-making. The use of complex machine learning algorithms, the growth in computing power, and the increase in data acquisition have amplified the black-box effects in data science. Consequently, a growing body of literature is investigating methods for interpretability and explainability. We focus on methods based on Shapley values, which are gaining attention as measures of feature importance for explaining black-box predictions. Our analysis follows a hierarchy of value functions and proves several theoretical properties that connect the indices at the alternative levels. We bridge the notions of totally monotone games and Shapley values, and introduce new interaction indices based on the Shapley-Owen values. The hierarchy reveals synergies that emerge when combining Shapley effects computed at different levels. We then propose a novel sensitivity analysis setting that combines the benefits of both local and global Shapley explanations, which we refer to as the "glocal" approach. We illustrate our integrated approach and discuss the managerial insights it provides in the context of a data-science problem related to health insurance policy-making.
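For readers unfamiliar with the underlying game-theoretic machinery, a minimal sketch may help. A Shapley value attributes to each player (here, each feature) its marginal contribution to a value function, averaged over all orderings of the players. The snippet below computes exact Shapley values by enumerating coalitions; the feature names and additive toy value function are purely illustrative assumptions, not the paper's health-insurance application or its hierarchy of value functions.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values by enumerating all coalitions.

    `value` maps a frozenset coalition to a real number; the Shapley
    value of player i is i's marginal contribution value(S | {i}) - value(S),
    weighted by |S|!(n - |S| - 1)!/n! over all coalitions S not containing i.
    """
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(n):
            for S in combinations(others, k):
                S = frozenset(S)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value(S | {i}) - value(S))
        phi[i] = total
    return phi

# Hypothetical features with additive individual effects. For an additive
# game, efficiency implies each feature's Shapley value recovers its effect.
effects = {"age": 2.0, "bmi": 1.0, "smoker": 3.0}
v = lambda S: sum(effects[p] for p in S)
print(shapley_values(list(effects), v))  # each phi equals the feature's effect
```

Exact enumeration costs O(2^n) evaluations of the value function, which is why practical explainability tools approximate Shapley values by sampling orderings or coalitions.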
About the journal:
The European Journal of Operational Research (EJOR) publishes high-quality, original papers that contribute to the methodology of operational research (OR) and to the practice of decision making.