Institutionalized Algorithmic Enforcement—The Pros and Cons of the EU Approach to UGC Platform Liability

Martin Senftleben
{"title":"制度化的算法执行——欧盟UGC平台责任方法的利弊","authors":"Martin Senftleben","doi":"10.25148/LAWREV.14.2.11","DOIUrl":null,"url":null,"abstract":"Algorithmic copyright enforcement – the use of automated filtering tools to detect infringing content before it appears on the internet – has a deep impact on the freedom of users to upload and share information. Instead of presuming that user-generated content (\"UGC\") does not amount to infringement unless copyright owners take action and provide proof, the default position of automated filtering systems is that every upload is suspicious and that copyright owners are entitled to ex ante control over the sharing of information online. If platform providers voluntarily introduce algorithmic enforcement measures, this may be seen as a private decision following from the freedom of companies to run their business as they wish. If, however, copyright legislation institutionalizes algorithmic enforcement and imposes a legal obligation on platform providers to employ automated filtering tools, the law itself transforms copyright into a censorship and filtering instrument. Nonetheless, the new EU Directive on Copyright in the Digital Single Market (“DSM Directive”) follows this path and requires the employment of automated filtering tools to ensure that unauthorized protected content does not populate UGC platforms. The new EU rules on UGC licensing and screening will inevitably lead to the adoption of algorithmic enforcement measures in practice. Without automated content control, UGC platforms will be unable to escape liability for infringing user uploads. To provide a complete picture, however, it is important to also shed light on counterbalances which may distinguish this new, institutionalized form of algorithmic enforcement from known content filtering tools that have evolved as voluntary measures in the private sector. The DSM Directive underlines the necessity to safeguard user freedoms that support transformative, creative remixes and mash-ups of pre-existing content. This feature of the new legislation may offer important incentives to develop algorithmic tools that go beyond the mere identification of unauthorized takings from protected works. It has the potential to encourage content assessment mechanisms that factor the degree of transformative effort and user creativity into the equation. As a result, more balanced content filtering tools may emerge in the EU. Against this background, the analysis shows that the new EU legislation not only escalates the use of algorithmic enforcement measures that already commenced in the private sector years ago. If rightly implemented, it may also add an important nuance to existing content identification tools and alleviate the problems arising from reliance on automated filtering mechanisms.","PeriodicalId":300333,"journal":{"name":"FIU Law Review","volume":"187 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Institutionalized Algorithmic Enforcement—The Pros and Cons of the EU Approach to UGC Platform Liability\",\"authors\":\"Martin Senftleben\",\"doi\":\"10.25148/LAWREV.14.2.11\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Algorithmic copyright enforcement – the use of automated filtering tools to detect infringing content before it appears on the internet – has a deep impact on the freedom of users to upload and share information. 
Instead of presuming that user-generated content (\\\"UGC\\\") does not amount to infringement unless copyright owners take action and provide proof, the default position of automated filtering systems is that every upload is suspicious and that copyright owners are entitled to ex ante control over the sharing of information online. If platform providers voluntarily introduce algorithmic enforcement measures, this may be seen as a private decision following from the freedom of companies to run their business as they wish. If, however, copyright legislation institutionalizes algorithmic enforcement and imposes a legal obligation on platform providers to employ automated filtering tools, the law itself transforms copyright into a censorship and filtering instrument. Nonetheless, the new EU Directive on Copyright in the Digital Single Market (“DSM Directive”) follows this path and requires the employment of automated filtering tools to ensure that unauthorized protected content does not populate UGC platforms. The new EU rules on UGC licensing and screening will inevitably lead to the adoption of algorithmic enforcement measures in practice. Without automated content control, UGC platforms will be unable to escape liability for infringing user uploads. To provide a complete picture, however, it is important to also shed light on counterbalances which may distinguish this new, institutionalized form of algorithmic enforcement from known content filtering tools that have evolved as voluntary measures in the private sector. The DSM Directive underlines the necessity to safeguard user freedoms that support transformative, creative remixes and mash-ups of pre-existing content. This feature of the new legislation may offer important incentives to develop algorithmic tools that go beyond the mere identification of unauthorized takings from protected works. It has the potential to encourage content assessment mechanisms that factor the degree of transformative effort and user creativity into the equation. As a result, more balanced content filtering tools may emerge in the EU. Against this background, the analysis shows that the new EU legislation not only escalates the use of algorithmic enforcement measures that already commenced in the private sector years ago. If rightly implemented, it may also add an important nuance to existing content identification tools and alleviate the problems arising from reliance on automated filtering mechanisms.\",\"PeriodicalId\":300333,\"journal\":{\"name\":\"FIU Law Review\",\"volume\":\"187 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"FIU Law Review\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.25148/LAWREV.14.2.11\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"FIU Law Review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.25148/LAWREV.14.2.11","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Algorithmic copyright enforcement – the use of automated filtering tools to detect infringing content before it appears on the internet – has a deep impact on the freedom of users to upload and share information. Instead of presuming that user-generated content ("UGC") does not amount to infringement unless copyright owners take action and provide proof, the default position of automated filtering systems is that every upload is suspicious and that copyright owners are entitled to ex ante control over the sharing of information online. If platform providers voluntarily introduce algorithmic enforcement measures, this may be seen as a private decision following from the freedom of companies to run their business as they wish. If, however, copyright legislation institutionalizes algorithmic enforcement and imposes a legal obligation on platform providers to employ automated filtering tools, the law itself transforms copyright into a censorship and filtering instrument. Nonetheless, the new EU Directive on Copyright in the Digital Single Market (“DSM Directive”) follows this path and requires the employment of automated filtering tools to ensure that unauthorized protected content does not populate UGC platforms. The new EU rules on UGC licensing and screening will inevitably lead to the adoption of algorithmic enforcement measures in practice. Without automated content control, UGC platforms will be unable to escape liability for infringing user uploads. To provide a complete picture, however, it is important to also shed light on counterbalances which may distinguish this new, institutionalized form of algorithmic enforcement from known content filtering tools that have evolved as voluntary measures in the private sector. The DSM Directive underlines the necessity to safeguard user freedoms that support transformative, creative remixes and mash-ups of pre-existing content. This feature of the new legislation may offer important incentives to develop algorithmic tools that go beyond the mere identification of unauthorized takings from protected works. It has the potential to encourage content assessment mechanisms that factor the degree of transformative effort and user creativity into the equation. As a result, more balanced content filtering tools may emerge in the EU. Against this background, the analysis shows that the new EU legislation not only escalates the use of algorithmic enforcement measures that already commenced in the private sector years ago. If rightly implemented, it may also add an important nuance to existing content identification tools and alleviate the problems arising from reliance on automated filtering mechanisms.
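To make the contrast drawn in the abstract concrete, the sketch below compares a naive filter that treats any fingerprint match as infringing with a hypothetical assessment that also weighs signals of transformative effort before blocking an upload. All field names, thresholds, and the scoring rule are illustrative assumptions for exposition only; they do not describe any deployed filtering system or the specific requirements of the DSM Directive.

```python
# Minimal, hypothetical sketch: naive match-based blocking vs. an assessment
# that factors transformative effort into the decision. Every threshold and
# field name here is an assumption made for illustration.

from dataclasses import dataclass


@dataclass
class UploadAnalysis:
    matched_fraction: float   # share of the upload matching a protected work (0.0-1.0)
    original_fraction: float  # share consisting of newly added material (0.0-1.0)
    has_commentary: bool      # e.g. voice-over, critique, or parody markers detected


def naive_filter(upload: UploadAnalysis) -> str:
    """Default 'every match is suspicious' posture: block on any detected match."""
    return "block" if upload.matched_fraction > 0.0 else "publish"


def transformative_aware_filter(upload: UploadAnalysis) -> str:
    """Hypothetical assessment that weighs transformative signals against the match.

    Uploads dominated by protected material are blocked; heavily reworked or
    commented uploads are routed to human review instead of automatic removal.
    """
    if upload.matched_fraction < 0.1:
        return "publish"
    transformative_score = upload.original_fraction + (0.3 if upload.has_commentary else 0.0)
    if transformative_score >= upload.matched_fraction:
        return "human review"  # plausible remix or mash-up: do not auto-block
    return "block"


if __name__ == "__main__":
    remix = UploadAnalysis(matched_fraction=0.4, original_fraction=0.5, has_commentary=True)
    rip = UploadAnalysis(matched_fraction=0.95, original_fraction=0.05, has_commentary=False)
    print(naive_filter(remix), "/", transformative_aware_filter(remix))  # block / human review
    print(naive_filter(rip), "/", transformative_aware_filter(rip))      # block / block
```

The point of the sketch is only that the decision rule, not just the matching technology, determines whether creative remixes survive screening; how such a rule would be calibrated in practice is left open by the abstract.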