{"title":"Publish, Share, Re-Tweet, and Repeat","authors":"Michal Lavi","doi":"10.36646/MJLR.54.2.PUBLISH","DOIUrl":null,"url":null,"abstract":"New technologies allow users to communicate ideas to a broad audience easily and quickly, affecting the way ideas are interpreted and their credibility. Each and every social network user can simply click “share” or “retweet” and automatically republish an existing post and expose a new message to a wide audience. The dissemination of ideas can raise public awareness about important issues and bring about social, political, and economic change. Yet, digital sharing also provides vast opportunities to spread false rumors, defamation, and Fake News stories at the thoughtless click of a button. The spreading of falsehoods can severely harm the reputation of victims, erode democracy, and infringe on the public interest. Holding the original publisher accountable and collecting damages from him offers very limited redress since the harmful expression can continue to spread. How should the law respond to this phenomenon and who should be held accountable? Drawing on multidisciplinary social science scholarship from network theory and cognitive psychology, this Article describes how falsehoods spread on social networks, the different motivations to disseminate them, the gravity of the harm they can inflict, and the likelihood of correcting false information once it has been distributed in this setting. This Article will also describe the top-down influence of social media platform intermediaries, and how it enhances dissemination by exploiting users’ cognitive biases and creating social cues that encourage users to share information. Understanding how falsehoods spread is a first step towards providing a framework for meeting this challenge. The Article argues that it is high time to rethink intermediary duties and obligations regarding the dissemination of falsehoods. 
It examines a new perspective for mitigating the harm caused by the dissemination of falsehood. The Article advocates harnessing social network intermediaries to meet the challenge of dissemination from the stage of platform design. It proposes innovative solutions for mitigating careless, irresponsible sharing of false rumors. The first solution focuses on a platform’s accountability for influencing user decision-making processes. “Nudges” can discourage users from thoughtless sharing of falsehoods and promote accountability ex ante. The second solution focuses on allowing effective ex post facto removal of falsehoods, defamation, and fake news stories from all profiles and locations where they have spread. Shaping user choices and designing platforms is value laden, reflecting the platform’s particular set of preferences, and should not be taken for granted. Therefore, this Article proposes ways to incentivize intermediaries to adopt these solutions and mitigate the harm generated by the spreading of falsehoods. 
Finally, the Article addresses the limitations of the proposed solutions yet still concludes that they are more effective than current legal practices.","PeriodicalId":83420,"journal":{"name":"University of Michigan journal of law reform. University of Michigan. Law School","volume":"61 1","pages":"441-524"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"University of Michigan journal of law reform. University of Michigan. Law School","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.36646/MJLR.54.2.PUBLISH","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
New technologies allow users to communicate ideas to a broad audience easily and quickly, affecting the way ideas are interpreted and their credibility. Every social network user can simply click “share” or “retweet” to automatically republish an existing post and expose a new message to a wide audience. The dissemination of ideas can raise public awareness about important issues and bring about social, political, and economic change. Yet digital sharing also provides vast opportunities to spread false rumors, defamation, and fake news stories at the thoughtless click of a button. The spreading of falsehoods can severely harm the reputation of victims, erode democracy, and infringe on the public interest. Holding the original publisher accountable and collecting damages from that publisher offers very limited redress, since the harmful expression can continue to spread. How should the law respond to this phenomenon, and who should be held accountable? Drawing on multidisciplinary social science scholarship from network theory and cognitive psychology, this Article describes how falsehoods spread on social networks, the different motivations to disseminate them, the gravity of the harm they can inflict, and the likelihood of correcting false information once it has been distributed in this setting. This Article will also describe the top-down influence of social media platform intermediaries, and how it enhances dissemination by exploiting users’ cognitive biases and creating social cues that encourage users to share information. Understanding how falsehoods spread is a first step towards providing a framework for meeting this challenge. The Article argues that it is high time to rethink intermediary duties and obligations regarding the dissemination of falsehoods. It examines a new perspective for mitigating the harm caused by the dissemination of falsehood. 
The Article advocates harnessing social network intermediaries to meet the challenge of dissemination from the stage of platform design. It proposes innovative solutions for mitigating careless, irresponsible sharing of false rumors. The first solution focuses on a platform’s accountability for influencing user decision-making processes. “Nudges” can discourage users from thoughtless sharing of falsehoods and promote accountability ex ante. The second solution focuses on allowing effective ex post facto removal of falsehoods, defamation, and fake news stories from all profiles and locations where they have spread. Shaping user choices and designing platforms is value laden, reflecting the platform’s particular set of preferences, and should not be taken for granted. Therefore, this Article proposes ways to incentivize intermediaries to adopt these solutions and mitigate the harm generated by the spreading of falsehoods. Finally, the Article addresses the limitations of the proposed solutions yet still concludes that they are more effective than current legal practices.

* Ph.D. (Law). Research Fellow, Hadar Jabotinsky Center for Interdisciplinary Research of Financial Markets, Crisis and Technology; Postdoctoral Fellow, University of Haifa, Faculty of Law; Cyberlaw Fellow, Federmann Center, Hebrew University; Cheshin Fellow, Hebrew University, Faculty of Law, 2018. I thank Michal Shur-Ofry and Emily Cooper. Special thanks are due to Nick Adkins, Hannah Basalone, Nicole Frazer, Sumner Truax, and their colleagues on the University of Michigan Journal of Law Reform staff. I dedicate this Article to the memory of my mother, Aviva Lavi, who died suddenly and unexpectedly. My mother taught me to love knowledge and gave me the strength to pursue it. She will always be loved, remembered, and dearly missed.