{"title":"Soft Governance Across Digital Platforms Using Transparency","authors":"Anil R. Doshi, William Schmidt","doi":"10.1287/stsc.2023.0006","DOIUrl":null,"url":null,"abstract":"Platform governance helps align the activities of participating actors to deliver value within the platforms. These platforms can operate in environments where governance is intentionally or conventionally weak in favor of open access, frictionless transactions, or free speech. Such low- or no-governance environments leave room for illegitimate actors to penetrate platforms with illegitimate content or transactions. We propose that an external observer can employ transparency mechanisms to establish “soft” governance that allows participants in a low-governance environment to distinguish between sources of legitimate and illegitimate content. We examine how this might work in the context of disinformation Internet domains by training a machine learning classifier to discern between low-legitimacy from high-legitimacy content providers based on website registration data. The results suggest that an independent observer can employ such a classifier to provide an early, although imperfect, signal of whether a website is intended to host illegitimate content. We show that the independent observer can be effective at serving multiple platforms by providing intermediate prediction results that platforms can align with their unique governance priorities. We expand our analysis with a signaling game model to ascertain whether such a soft governance structure can be resilient to adversarial responses. Funding: Funding for this research was provided by UCL School of Management and Emory University. Supplemental Material: The online appendix is available at https://doi.org/10.1287/stsc.2023.0006 .","PeriodicalId":45295,"journal":{"name":"Strategy Science","volume":null,"pages":null},"PeriodicalIF":2.9000,"publicationDate":"2024-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Strategy Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1287/stsc.2023.0006","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MANAGEMENT","Score":null,"Total":0}
Citations: 0
Abstract
Platform governance helps align the activities of participating actors to deliver value within the platform. These platforms can operate in environments where governance is intentionally or conventionally weak in favor of open access, frictionless transactions, or free speech. Such low- or no-governance environments leave room for illegitimate actors to penetrate platforms with illegitimate content or transactions. We propose that an external observer can employ transparency mechanisms to establish “soft” governance that allows participants in a low-governance environment to distinguish between sources of legitimate and illegitimate content. We examine how this might work in the context of disinformation Internet domains by training a machine learning classifier to distinguish low-legitimacy from high-legitimacy content providers based on website registration data. The results suggest that an independent observer can employ such a classifier to provide an early, although imperfect, signal of whether a website is intended to host illegitimate content. We show that the independent observer can be effective at serving multiple platforms by providing intermediate prediction results that platforms can align with their unique governance priorities. We expand our analysis with a signaling game model to ascertain whether such a soft governance structure can be resilient to adversarial responses. Funding: Funding for this research was provided by UCL School of Management and Emory University. Supplemental Material: The online appendix is available at https://doi.org/10.1287/stsc.2023.0006.
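
The abstract describes training a classifier on website registration data and exposing intermediate prediction results that each platform can align with its own governance priorities. As a rough illustration of that idea only (not the authors' actual features, data, or model, none of which are given here), the sketch below fits a scikit-learn classifier on hypothetical registration-derived features and returns probability scores that different platforms could threshold differently.

```python
# Minimal sketch, not the paper's implementation: a classifier over
# website-registration features that emits probability scores, which
# individual platforms can threshold to match their own governance priorities.
# Feature names and data below are hypothetical illustrations.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1_000

# Hypothetical registration-derived features for n domains.
X = pd.DataFrame({
    "days_since_registration": rng.integers(1, 3650, n),
    "registration_length_years": rng.integers(1, 10, n),
    "uses_privacy_proxy": rng.integers(0, 2, n),
    "registrar_risk_score": rng.random(n),
    "name_server_count": rng.integers(1, 6, n),
})
# Synthetic labels: 1 = low-legitimacy (e.g., disinformation) domain.
y = rng.integers(0, 2, n)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# "Intermediate prediction results": probabilities rather than hard labels,
# so each platform chooses its own decision threshold.
scores = clf.predict_proba(X_test)[:, 1]
print("AUC on synthetic data:", round(roc_auc_score(y_test, scores), 3))

# Example: a stricter platform flags anything above 0.3; a more permissive
# platform flags only scores above 0.8.
strict_flags = scores > 0.3
permissive_flags = scores > 0.8
```

Sharing calibrated scores rather than binary verdicts is what lets one independent observer serve multiple platforms: the model is trained once, while the flagging threshold remains a per-platform governance choice.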