{"title":"“高度细致入微的政策很难大规模实施”:检查研究人员的账户和内容在网上被删除","authors":"Aaron Y. Zelin","doi":"10.1002/poi3.374","DOIUrl":null,"url":null,"abstract":"Abstract Since 2019, researchers examining, archiving, and collecting extremist and terrorist materials online have increasingly been taken offline. In part a consequence of the automation of content moderation by different technology companies and national governments calling for ever quicker takedowns. Based on an online survey of peers in the field, this research highlights that up to 60% of researchers surveyed have had either their accounts or content they have posted or stored online taken down from varying platforms. Beyond the quantitative data, this research also garnered qualitative answers about concerns individuals in the field had related to this problem set, namely, the lack of transparency on the part of the technology companies, hindering actual research and understanding of complicated and evolving issues related to different extremist and terrorist phenomena, undermining potential collaboration within the research field, and the potential of self‐censorship online. An easy solution to this would be a whitelist, though there are inherent downsides related to this as well, especially between researchers at different levels in their careers, institutional affiliation or lack thereof, and inequalities between researchers from the West versus Global South. Either way, securitizing research in however form it evolves in the future will fundamentally hurt research.","PeriodicalId":46894,"journal":{"name":"Policy and Internet","volume":"6 1","pages":"0"},"PeriodicalIF":4.1000,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"“Highly nuanced policy is very difficult to apply at scale”: Examining researcher account and content takedowns online\",\"authors\":\"Aaron Y. Zelin\",\"doi\":\"10.1002/poi3.374\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Since 2019, researchers examining, archiving, and collecting extremist and terrorist materials online have increasingly been taken offline. In part a consequence of the automation of content moderation by different technology companies and national governments calling for ever quicker takedowns. Based on an online survey of peers in the field, this research highlights that up to 60% of researchers surveyed have had either their accounts or content they have posted or stored online taken down from varying platforms. Beyond the quantitative data, this research also garnered qualitative answers about concerns individuals in the field had related to this problem set, namely, the lack of transparency on the part of the technology companies, hindering actual research and understanding of complicated and evolving issues related to different extremist and terrorist phenomena, undermining potential collaboration within the research field, and the potential of self‐censorship online. An easy solution to this would be a whitelist, though there are inherent downsides related to this as well, especially between researchers at different levels in their careers, institutional affiliation or lack thereof, and inequalities between researchers from the West versus Global South. 
Either way, securitizing research in however form it evolves in the future will fundamentally hurt research.\",\"PeriodicalId\":46894,\"journal\":{\"name\":\"Policy and Internet\",\"volume\":\"6 1\",\"pages\":\"0\"},\"PeriodicalIF\":4.1000,\"publicationDate\":\"2023-11-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Policy and Internet\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1002/poi3.374\",\"RegionNum\":1,\"RegionCategory\":\"文学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMMUNICATION\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Policy and Internet","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/poi3.374","RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMMUNICATION","Score":null,"Total":0}
Abstract: Since 2019, researchers who examine, archive, and collect extremist and terrorist materials online have increasingly been taken offline, in part as a consequence of the automation of content moderation by technology companies and of national governments calling for ever quicker takedowns. Based on an online survey of peers in the field, this research highlights that up to 60% of researchers surveyed have had their accounts, or content they posted or stored online, taken down from various platforms. Beyond the quantitative data, this research also garnered qualitative answers about the concerns individuals in the field have related to this problem set: the lack of transparency on the part of technology companies, the hindering of research on and understanding of complicated and evolving extremist and terrorist phenomena, the undermining of potential collaboration within the research field, and the potential for self-censorship online. An easy solution would be a whitelist of researcher accounts, though this carries inherent downsides as well, especially given differences between researchers at different career stages, with and without institutional affiliation, and inequalities between researchers from the West and the Global South. Either way, securitizing research, in whatever form that takes in the future, will fundamentally hurt the field.
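The "whitelist" the abstract proposes would, in effect, be an exemption list consulted by an automated moderation pipeline before a takedown is executed. The sketch below is a hypothetical illustration only and is not described in the paper: the account IDs, classifier threshold, and function names are all invented for the example.

```python
# Hypothetical sketch (not from the paper): an automated moderation pipeline
# that checks a researcher allowlist ("whitelist") before acting on a
# takedown signal. All identifiers and thresholds here are invented.

from dataclasses import dataclass

# Assumed allowlist of verified researcher account IDs.
RESEARCHER_ALLOWLIST = {"acct_1234", "acct_5678"}

@dataclass
class FlaggedItem:
    account_id: str
    content_id: str
    classifier_score: float  # automated classifier confidence, 0..1

def decide_action(item: FlaggedItem, takedown_threshold: float = 0.9) -> str:
    """Return 'remove', 'human_review', or 'keep' for a flagged item."""
    if item.classifier_score < takedown_threshold:
        return "keep"
    # Allowlisted research accounts are routed to human review instead of
    # being removed automatically -- the exemption the abstract describes.
    if item.account_id in RESEARCHER_ALLOWLIST:
        return "human_review"
    return "remove"

if __name__ == "__main__":
    print(decide_action(FlaggedItem("acct_1234", "post_42", 0.97)))  # human_review
    print(decide_action(FlaggedItem("acct_9999", "post_43", 0.97)))  # remove
```

Even a sketch this simple surfaces the governance questions the abstract raises: who maintains such a list, and which researchers, by career stage, institutional affiliation, or region, would qualify for it.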
Journal introduction:
Understanding public policy in the age of the Internet requires understanding how individuals, organizations, governments and networks behave, and what motivates them in this new environment. Technological innovation and internet-mediated interaction raise both challenges and opportunities for public policy: whether in areas that have received much work already (e.g. digital divides, digital government, and privacy) or newer areas, like regulation of data-intensive technologies and platforms, the rise of precarious labour, and regulatory responses to misinformation and hate speech. We welcome innovative research in areas where the Internet already impacts public policy, where it raises new challenges or dilemmas, or provides opportunities for policy that is smart and equitable. While we welcome perspectives from any academic discipline, we look particularly for insight that can feed into social science disciplines like political science, public administration, economics, sociology, and communication. We welcome articles that introduce methodological innovation, theoretical development, or rigorous data analysis concerning a particular question or problem of public policy.