Authors: Donghee Shin, Kulsawasd Jitkajornwanich
DOI: 10.1177/08944393231225547
Journal: Social Science Computer Review (JCR Q2, Computer Science, Interdisciplinary Applications; Impact Factor 3.0)
Publication date: 2024-07-30
Publication type: Journal Article
How Algorithms Promote Self-Radicalization: Audit of TikTok’s Algorithm Using a Reverse Engineering Method
Algorithmic radicalization is the idea that algorithms used by social media platforms push people down digital “rabbit holes” by framing personal online activity. Algorithms control what people see and when they see it, learning from their past activity. As such, people gradually and subconsciously adopt the ideas presented to them by the rabbit hole down which they have been pushed. In this study, TikTok’s role in fostering radicalized ideology is examined to offer a critical analysis of the state of radicalism and extremism on platforms. This study conducted an algorithm audit of the role of radicalizing information in social media by examining how TikTok’s algorithms are being used to radicalize, polarize, and spread extremism and societal instability. The results revealed that the pathways through which users access far-right content are manifold and that a large portion of the content can be ascribed to platform recommendations through radicalization pipelines. Algorithms are not simple tools that offer personalized services but rather contributors to radicalism, societal violence, and polarization. Such personalization processes have been instrumental in how artificial intelligence (AI) has been designed, deployed, and used, and in the detrimental outcomes it has generated. Thus, the generation and adoption of extreme content on TikTok are, by and large, not only a reflection of user inputs and interactions with the platform but also of the platform’s ability to slot users into specific categories and reinforce their ideas.
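The feedback dynamic the abstract describes, in which engagement with served content sharpens the profile that drives future recommendations, can be illustrated with a toy simulation. This is a minimal sketch of the general rabbit-hole mechanism, not TikTok's actual recommender: the category names, the multiplicative reinforcement rule, and the `boost` parameter are all illustrative assumptions. The narrowing of what the user is shown is measured as a drop in the Shannon entropy of the serving distribution.

```python
import math
import random

# Illustrative content categories (hypothetical, not drawn from the study).
CATEGORIES = ["news", "sports", "music", "fringe"]

def recommend(weights, rng):
    """Sample one category in proportion to the current preference weights."""
    total = sum(weights.values())
    r = rng.random() * total
    for cat, w in weights.items():
        r -= w
        if r <= 0:
            return cat
    return cat  # numerical edge case: fall back to the last category

def entropy(weights):
    """Shannon entropy (bits) of the serving distribution."""
    total = sum(weights.values())
    return -sum((w / total) * math.log2(w / total) for w in weights.values())

def simulate(steps=200, boost=1.2, seed=0):
    """Feedback loop: each served item reinforces its own category's weight."""
    rng = random.Random(seed)
    weights = {c: 1.0 for c in CATEGORIES}  # start with no preference
    before = entropy(weights)
    for _ in range(steps):
        served = recommend(weights, rng)
        weights[served] *= boost  # engagement makes that category more likely
    return before, entropy(weights)

before, after = simulate()
print(f"entropy before: {before:.2f} bits, entropy after: {after:.2f} bits")
```

Under this rich-get-richer rule the serving distribution concentrates on whichever categories happen to get early engagement, so entropy falls from its uniform starting value, mirroring how "user inputs and interactions" and "platform recommendations" jointly narrow what a user sees.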
Journal introduction:
Unique scope: Social Science Computer Review is an interdisciplinary journal covering social science instructional and research applications of computing, as well as societal impacts of information technology. Topics include: artificial intelligence, business, computational social science theory, computer-assisted survey research, computer-based qualitative analysis, computer simulation, economic modeling, electronic modeling, electronic publishing, geographic information systems, instrumentation and research tools, public administration, social impacts of computing and telecommunications, software evaluation, and World Wide Web resources for social scientists. Interdisciplinary nature: Because the uses and impacts of computing are interdisciplinary, so is Social Science Computer Review. The journal is of direct relevance to scholars and scientists in a wide variety of disciplines. In its pages you'll find work in the following areas: sociology, anthropology, political science, economics, psychology, computer literacy, computer applications, and methodology.