Title: Ordering effects in discrete choice experiments: A systematic literature review across domains
Author: Sander Boxebeld
Journal: Journal of Choice Modelling, Volume 51, Article 100489 (JCR Q1, Economics; Impact Factor 2.8)
DOI: 10.1016/j.jocm.2024.100489
URL: https://www.sciencedirect.com/science/article/pii/S1755534524000216
Publication date: 2024-04-17
Publication type: Journal Article
Open access: No
Citations: 0
Abstract
Discrete choice experiments (DCEs) are increasingly used in several scientific domains. Since their results may be used to inform governmental decision-making, it is important that the validity of the method is continuously scrutinized. An often-studied design artefact is the impact of the presentation order of alternatives, attributes, and choice sets on the results of a DCE. No systematic review of the literature on ordering effects existed until now, and many applied studies using a DCE do not explicitly consider the role of ordering effects. In this study, I conducted a systematic review of the literature on ordering effects. Using a three-step snowball sampling strategy, 85 studies were identified across various scientific domains. The majority of included studies documented statistically significant ordering effects. Alternative and attribute ordering effects are primarily caused by lexicographic behaviours, while choice set ordering effects seem to be caused by respondent learning, fatigue, or anchoring. Although ordering effects may not always occur, the fact that most studies did find statistically significant effects warrants the use of mitigation methods. An overview of potential mitigation methods for the applied DCE literature is presented, including randomization of presentation orders, advance disclosure of DCE core elements, and inclusion of alternative-specific constants (ASCs), attribute level overlap, and an instructional choice set (ICS). Finally, several directions for future methodological research on this topic are provided, particularly regarding heterogeneity in ordering effects by study design traits and respondent characteristics, and interactions between ordering effects. Insights into these aspects would further our understanding of respondents' processing of DCEs.
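Of the mitigation methods listed above, randomization of presentation orders is the most mechanical to implement in survey software. The sketch below is a minimal, hypothetical illustration (not code from the reviewed studies): for each respondent, the orders of alternatives, attributes, and choice sets are shuffled independently, so that any ordering effect averages out across the sample instead of systematically biasing the estimates.

```python
import random


def randomize_dce_orders(alternatives, attributes, choice_sets, seed=None):
    """Draw independent random presentation orders for one respondent.

    A minimal sketch of the order-randomization mitigation discussed in
    the abstract. The argument names are illustrative: each argument is a
    list of design elements, and each is returned in a freshly shuffled
    order. Passing a per-respondent seed makes the assignment reproducible.
    """
    rng = random.Random(seed)
    # random.Random.sample returns a new shuffled list without
    # modifying the input, so the master design stays intact.
    alt_order = rng.sample(alternatives, len(alternatives))
    attr_order = rng.sample(attributes, len(attributes))
    set_order = rng.sample(choice_sets, len(choice_sets))
    return alt_order, attr_order, set_order


# Example: one respondent's randomized DCE layout.
alts, attrs, sets_ = randomize_dce_orders(
    alternatives=["Option A", "Option B", "Opt-out"],
    attributes=["price", "waiting time", "quality"],
    choice_sets=list(range(1, 9)),
    seed=42,  # e.g. the respondent ID, for reproducibility
)
```

Shuffling each element type independently (rather than with one shared permutation) keeps alternative, attribute, and choice-set ordering effects from being confounded with one another across respondents.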