Mapping interpretations of the law in online content moderation in Germany

IF 3.3 · CAS Tier 3 (Sociology) · JCR Q1 (LAW) · Computer Law & Security Review · Pub Date: 2024-10-09 · DOI: 10.1016/j.clsr.2024.106054
{"title":"绘制德国在线内容节制方面的法律解释图","authors":"","doi":"10.1016/j.clsr.2024.106054","DOIUrl":null,"url":null,"abstract":"<div><div>Content moderation is a vital condition that online platforms must facilitate, according to the law, to create suitable online environments for their users. By the law, we mean national or European laws that require the removal of content by online platforms, such as EU Regulation 2021/784, which addresses the dissemination of terrorist content online. Content moderation required by these national or European laws, summarised here as ‘the law’, is different from the moderation of pieces of content that is not directly required by law but instead is conducted voluntarily by the platforms. New regulatory requests create an additional layer of complexity of legal grounds for the moderation of content and are relevant to platforms’ daily decisions. The decisions made are either grounded in reasons stemming from different sources of law, such as international or national provisions, or can be based on contractual grounds, such as the platform's Terms of Service and Community Standards. However, how to empirically measure these essential aspects of content moderation remains unclear. Therefore, we ask the following research question: How do online platforms interpret the law when they moderate online content?</div><div>To understand this complex interplay and empirically test the quality of a platform's content moderation claims, this article develops a methodology that facilitates empirical evidence of the individual decisions taken per piece of content while highlighting the subjective element of content classification by human moderators. We then apply this methodology to a single empirical case, an anonymous medium-sized German platform that provided us access to their content moderation decisions. With more knowledge of how platforms interpret the law, we can better understand the complex nature of content moderation, its regulation and compliance practices, and to what degree legal moderation might differ from moderation due to contractual reasons in dimensions such as the need for context, information, and time.</div><div>Our results show considerable divergence between the platform's interpretation of the law and ours. We believe that a significant number of platform legal interpretations are incorrect due to divergent interpretations of the law and that platforms are removing legal content that they falsely believe to be illegal (‘overblocking’) while simultaneously not moderating illegal content (‘underblocking’). In conclusion, we provide recommendations for content moderation system design that takes (legal) human content moderation into account and creates new methodological ways to test its quality and effect on speech in online platforms.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":null,"pages":null},"PeriodicalIF":3.3000,"publicationDate":"2024-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Mapping interpretations of the law in online content moderation in Germany\",\"authors\":\"\",\"doi\":\"10.1016/j.clsr.2024.106054\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Content moderation is a vital condition that online platforms must facilitate, according to the law, to create suitable online environments for their users. 
By the law, we mean national or European laws that require the removal of content by online platforms, such as EU Regulation 2021/784, which addresses the dissemination of terrorist content online. Content moderation required by these national or European laws, summarised here as ‘the law’, is different from the moderation of pieces of content that is not directly required by law but instead is conducted voluntarily by the platforms. New regulatory requests create an additional layer of complexity of legal grounds for the moderation of content and are relevant to platforms’ daily decisions. The decisions made are either grounded in reasons stemming from different sources of law, such as international or national provisions, or can be based on contractual grounds, such as the platform's Terms of Service and Community Standards. However, how to empirically measure these essential aspects of content moderation remains unclear. Therefore, we ask the following research question: How do online platforms interpret the law when they moderate online content?</div><div>To understand this complex interplay and empirically test the quality of a platform's content moderation claims, this article develops a methodology that facilitates empirical evidence of the individual decisions taken per piece of content while highlighting the subjective element of content classification by human moderators. We then apply this methodology to a single empirical case, an anonymous medium-sized German platform that provided us access to their content moderation decisions. With more knowledge of how platforms interpret the law, we can better understand the complex nature of content moderation, its regulation and compliance practices, and to what degree legal moderation might differ from moderation due to contractual reasons in dimensions such as the need for context, information, and time.</div><div>Our results show considerable divergence between the platform's interpretation of the law and ours. We believe that a significant number of platform legal interpretations are incorrect due to divergent interpretations of the law and that platforms are removing legal content that they falsely believe to be illegal (‘overblocking’) while simultaneously not moderating illegal content (‘underblocking’). 
In conclusion, we provide recommendations for content moderation system design that takes (legal) human content moderation into account and creates new methodological ways to test its quality and effect on speech in online platforms.</div></div>\",\"PeriodicalId\":51516,\"journal\":{\"name\":\"Computer Law & Security Review\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.3000,\"publicationDate\":\"2024-10-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Law & Security Review\",\"FirstCategoryId\":\"90\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0267364924001201\",\"RegionNum\":3,\"RegionCategory\":\"社会学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"LAW\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Law & Security Review","FirstCategoryId":"90","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0267364924001201","RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"LAW","Score":null,"Total":0}
Citations: 0

Abstract

Content moderation is a vital function that online platforms must perform, according to the law, to create suitable online environments for their users. By ‘the law’, we mean national or European laws that require the removal of content by online platforms, such as EU Regulation 2021/784, which addresses the dissemination of terrorist content online. Content moderation required by these national or European laws, summarised here as ‘the law’, differs from moderation that is not directly required by law but is instead conducted voluntarily by the platforms. New regulatory requirements add a further layer of complexity to the legal grounds for content moderation and bear on platforms’ daily decisions. These decisions are grounded either in reasons stemming from different sources of law, such as international or national provisions, or in contractual grounds, such as the platform's Terms of Service and Community Standards. However, how to empirically measure these essential aspects of content moderation remains unclear. We therefore ask the following research question: how do online platforms interpret the law when they moderate online content?
To understand this complex interplay and empirically test the quality of a platform's content moderation claims, this article develops a methodology that yields empirical evidence on the individual decision taken for each piece of content while highlighting the subjective element of content classification by human moderators. We then apply this methodology to a single empirical case: an anonymous medium-sized German platform that gave us access to its content moderation decisions. With more knowledge of how platforms interpret the law, we can better understand the complex nature of content moderation, its regulation and compliance practices, and the degree to which legally required moderation may differ from moderation on contractual grounds in dimensions such as the need for context, information, and time.
Our results show considerable divergence between the platform's interpretation of the law and ours. We believe that a significant number of the platform's legal interpretations are incorrect owing to divergent readings of the law, and that the platform removes legal content it falsely believes to be illegal (‘overblocking’) while simultaneously failing to moderate illegal content (‘underblocking’). In conclusion, we provide recommendations for content moderation system design that take (legal) human content moderation into account, and we create new methodological ways to test its quality and its effect on speech on online platforms.
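The abstract does not disclose the statistical measures the authors used. Purely as a hypothetical illustration of how a re-coding study of this kind might quantify divergence, the sketch below compares a platform's classification of each piece of content with a researcher's independent classification, then computes overblocking and underblocking rates plus a chance-corrected agreement score (Cohen's kappa). All record fields, labels, and sample data are invented for illustration and are not drawn from the paper.

```python
from collections import Counter

# Hypothetical re-coding records: for each moderated piece of content,
# the platform's classification and the researcher's independent one,
# each drawn from {"illegal", "tos_violation", "legal"}.
decisions = [
    {"id": 1, "platform": "illegal", "researcher": "legal"},    # potential overblocking
    {"id": 2, "platform": "legal", "researcher": "illegal"},    # potential underblocking
    {"id": 3, "platform": "illegal", "researcher": "illegal"},
    {"id": 4, "platform": "tos_violation", "researcher": "tos_violation"},
    {"id": 5, "platform": "legal", "researcher": "legal"},
]

def over_under_blocking(records):
    """Return (overblocking rate, underblocking rate).

    Overblocking: the platform classed as illegal what the researcher
    classed as legal. Underblocking: the platform left up what the
    researcher classed as illegal.
    """
    n = len(records)
    over = sum(1 for r in records
               if r["platform"] == "illegal" and r["researcher"] == "legal")
    under = sum(1 for r in records
                if r["platform"] == "legal" and r["researcher"] == "illegal")
    return over / n, under / n

def cohen_kappa(records):
    """Chance-corrected agreement between the two coders."""
    n = len(records)
    observed = sum(1 for r in records if r["platform"] == r["researcher"]) / n
    p_counts = Counter(r["platform"] for r in records)
    r_counts = Counter(r["researcher"] for r in records)
    labels = set(p_counts) | set(r_counts)
    expected = sum((p_counts[l] / n) * (r_counts[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

over, under = over_under_blocking(decisions)
print(f"overblocking rate:  {over:.0%}")   # 20% in this toy sample
print(f"underblocking rate: {under:.0%}")  # 20% in this toy sample
print(f"Cohen's kappa:      {cohen_kappa(decisions):.2f}")
```

On real data the per-item records would come from the platform's decision logs and the researchers' re-coding sheets; the same counting logic would then separate divergence due to legal interpretation from divergence over contractual (Terms of Service) grounds.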
Source journal: Computer Law & Security Review — CiteScore 5.60 · Self-citation rate 10.30% · Annual output 81 articles · Review time 67 days
About the journal: CLSR publishes refereed academic and practitioner papers on topics such as Web 2.0, IT security, identity management, ID cards, RFID, interference with privacy, Internet law, telecoms regulation, online broadcasting, intellectual property, software law, e-commerce, outsourcing, data protection, EU policy, freedom of information, computer security and many other topics. In addition, it provides regular updates on European Union developments and national news from more than 20 jurisdictions in Europe and the Pacific Rim. It seeks papers within the subject area that display good-quality legal analysis and new lines of legal thought or policy development that go beyond mere description of the subject area, however accurate that may be.