Analysis of attribute importance in multinomial logit models using Shapley values-based methods

Patricio Salas, Rodrigo De la Fuente, Sebastian Astroza, Juan Antonio Carrasco

Journal of Choice Modelling, Volume 54, Article 100538. Published 2025-01-31. DOI: 10.1016/j.jocm.2025.100538
Citations: 0
Abstract
This paper investigates the use of Shapley values-based methods to determine the importance of attributes in discrete choice models, specifically within a Multinomial Logit (MNL) framework. We extend the Shapley decomposition method of Shorrocks (2013) from linear models to the MNL setting. Additionally, the SHAP method of Lundberg and Lee (2017) is applied to assess the impact of attributes on individual-level choice probability predictions. A simulation study demonstrates the effectiveness of these approaches under various experimental conditions, including attributes spanning several ranges and interaction terms. Finally, an empirical application is conducted using well-known travel mode choice datasets. The simulation results show that Shapley values accurately capture the global importance of attributes for goodness-of-fit. The SHAP method provides transparency in MNL model predictions, clarifying how changes in attribute values influence choice probabilities for each decision-maker. These methods offer a complementary perspective to traditional metrics such as elasticities and relative importance analysis (Orme, 2006). In the empirical application, Shapley decomposition highlights the most relevant attributes, while SHAP values uncover individual-level impacts that might not be apparent through elasticities alone. Combining global and individual-level analyses offers a more comprehensive understanding of attribute importance. In summary, integrating Shapley values with traditional metrics ensures a robust analysis, aiding practitioners and policymakers in making informed decisions based on both broad trends and specific impacts.
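The Shapley decomposition described above attributes a model's goodness-of-fit to individual attributes by averaging each attribute's marginal contribution over all orderings in which attributes can be added. The following is a minimal, self-contained sketch of that exact computation. It is not the paper's implementation: the value function `v(S)` here returns an illustrative fit measure (e.g., a pseudo-R-squared) for each attribute subset from a hard-coded toy table, whereas in practice `v(S)` would re-estimate the MNL model using only the attributes in `S`.

```python
from itertools import combinations
from math import factorial

def shapley_values(attributes, v):
    """Exact Shapley decomposition of a fit measure v over attribute subsets.

    Each attribute's share is the weighted average of its marginal
    contribution v(S ∪ {a}) - v(S) over all coalitions S not containing a.
    """
    n = len(attributes)
    phi = {a: 0.0 for a in attributes}
    for a in attributes:
        others = [x for x in attributes if x != a]
        for k in range(n):
            for S in combinations(others, k):
                # Weight = |S|! (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[a] += w * (v(frozenset(S) | {a}) - v(frozenset(S)))
    return phi

# Toy value function: fit obtained with each attribute subset.
# These numbers are purely illustrative, not taken from the paper.
FIT = {
    frozenset(): 0.00,
    frozenset({"cost"}): 0.20,
    frozenset({"time"}): 0.15,
    frozenset({"cost", "time"}): 0.30,
}

phi = shapley_values(["cost", "time"], lambda S: FIT[frozenset(S)])
# Efficiency property: the shares sum to v(all attributes) - v(empty set).
```

Exact enumeration costs 2^n model fits, so it is feasible only for a modest number of attributes; for larger attribute sets, sampling-based approximations of the Shapley value are the usual workaround.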