Regulating algorithmic discrimination through adjudication: the Court of Justice of the European Union on discrimination in algorithmic profiling based on PNR data

IF 2.3 · Q1 (International Relations) · Frontiers in Political Science · Pub Date: 2023-10-16 · DOI: 10.3389/fpos.2023.1232601
Lucas Michael Haitsma
{"title":"通过裁决规范算法歧视:欧盟法院关于基于PNR数据的算法侧写中的歧视问题","authors":"Lucas Michael Haitsma","doi":"10.3389/fpos.2023.1232601","DOIUrl":null,"url":null,"abstract":"This article considers the Court of Justice of the European Union's assessment and regulation of risks of discrimination in the context of algorithmic profiling based on Passenger Name Records data (PNR data). On the June 21, 2022 the court delivered a landmark judgment in Ligue des Droits Humains pertaining to discrimination and algorithmic profiling in a border security context. The CJEU identifies and seeks to regulate several risks of discrimination in relation to the automated processing of PNR data, the manual review of the results of this processing, and the resulting decisions taken by competent authorities. It interpreted whether the PNR Directive that lays down the legal basis for such profiling was compatible with the fundamental right to privacy, the right to data protection, and the right to non-discrimination. In its judgment, the CJEU seems to insufficiently assess various risks of discrimination. In particular, it overlooks risks relating to data quality and representativeness, automation bias, and practical difficulties in identifying discrimination. The judges also seem to prescribe safeguards against discrimination without guidance as to how to ensure their uniform and effective implementation. Such shortcomings can be observed in relation to ensuring the non-discriminatory nature of law enforcement databases, preventing indirectly discriminatory profiling practices based on collected PNR data, and configuring effective human-in-the-loop and transparency safeguards. This landmark judgement represents an important step in addressing algorithmic discrimination through CJEU adjudication. However, the CJEUs inability to sufficiently address the risks of discrimination in the context of algorithmic profiling based on the PNR Directive raises a broader concern. Namely, whether the CJEU is adequately equipped to combat algorithmic discrimination in the broader realm of European border security where algorithmic profiling is becoming increasingly commonplace.","PeriodicalId":34431,"journal":{"name":"Frontiers in Political Science","volume":"19 1","pages":"0"},"PeriodicalIF":2.3000,"publicationDate":"2023-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Regulating algorithmic discrimination through adjudication: the Court of Justice of the European Union on discrimination in algorithmic profiling based on PNR data\",\"authors\":\"Lucas Michael Haitsma\",\"doi\":\"10.3389/fpos.2023.1232601\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This article considers the Court of Justice of the European Union's assessment and regulation of risks of discrimination in the context of algorithmic profiling based on Passenger Name Records data (PNR data). On the June 21, 2022 the court delivered a landmark judgment in Ligue des Droits Humains pertaining to discrimination and algorithmic profiling in a border security context. The CJEU identifies and seeks to regulate several risks of discrimination in relation to the automated processing of PNR data, the manual review of the results of this processing, and the resulting decisions taken by competent authorities. 
It interpreted whether the PNR Directive that lays down the legal basis for such profiling was compatible with the fundamental right to privacy, the right to data protection, and the right to non-discrimination. In its judgment, the CJEU seems to insufficiently assess various risks of discrimination. In particular, it overlooks risks relating to data quality and representativeness, automation bias, and practical difficulties in identifying discrimination. The judges also seem to prescribe safeguards against discrimination without guidance as to how to ensure their uniform and effective implementation. Such shortcomings can be observed in relation to ensuring the non-discriminatory nature of law enforcement databases, preventing indirectly discriminatory profiling practices based on collected PNR data, and configuring effective human-in-the-loop and transparency safeguards. This landmark judgement represents an important step in addressing algorithmic discrimination through CJEU adjudication. However, the CJEUs inability to sufficiently address the risks of discrimination in the context of algorithmic profiling based on the PNR Directive raises a broader concern. Namely, whether the CJEU is adequately equipped to combat algorithmic discrimination in the broader realm of European border security where algorithmic profiling is becoming increasingly commonplace.\",\"PeriodicalId\":34431,\"journal\":{\"name\":\"Frontiers in Political Science\",\"volume\":\"19 1\",\"pages\":\"0\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2023-10-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in Political Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3389/fpos.2023.1232601\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"INTERNATIONAL RELATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Political Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/fpos.2023.1232601","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"INTERNATIONAL RELATIONS","Score":null,"Total":0}
Citations: 0

Abstract

This article considers the Court of Justice of the European Union's (CJEU) assessment and regulation of risks of discrimination in the context of algorithmic profiling based on Passenger Name Record (PNR) data. On June 21, 2022, the court delivered a landmark judgment in Ligue des Droits Humains pertaining to discrimination and algorithmic profiling in a border security context. The CJEU identifies and seeks to regulate several risks of discrimination in relation to the automated processing of PNR data, the manual review of the results of this processing, and the resulting decisions taken by competent authorities. It examined whether the PNR Directive, which lays down the legal basis for such profiling, is compatible with the fundamental right to privacy, the right to data protection, and the right to non-discrimination. In its judgment, the CJEU appears to insufficiently assess several risks of discrimination. In particular, it overlooks risks relating to data quality and representativeness, automation bias, and the practical difficulties of identifying discrimination. The judges also prescribe safeguards against discrimination without guidance on how to ensure their uniform and effective implementation. Such shortcomings can be observed in relation to ensuring the non-discriminatory nature of law enforcement databases, preventing indirectly discriminatory profiling practices based on collected PNR data, and configuring effective human-in-the-loop and transparency safeguards. This landmark judgment represents an important step in addressing algorithmic discrimination through CJEU adjudication. However, the CJEU's inability to sufficiently address the risks of discrimination in the context of algorithmic profiling under the PNR Directive raises a broader concern: whether the CJEU is adequately equipped to combat algorithmic discrimination in the wider realm of European border security, where algorithmic profiling is becoming increasingly commonplace.
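The abstract's concern about indirectly discriminatory profiling based on collected PNR data rests on a mechanism that a small example can make concrete: a screening rule can be facially neutral and still disadvantage a protected group when the features it uses correlate with group membership. The sketch below is a minimal, hypothetical illustration of that mechanism only; the record fields, the rule, and the toy population are invented and do not describe any real PNR screening system or the rules at issue in Ligue des Droits Humains.

```python
# Hypothetical sketch: how a facially neutral PNR screening rule can produce
# indirect discrimination. All features, rules, and data are invented for
# illustration; they do not reflect any actual PNR system.
from dataclasses import dataclass
from collections import Counter

@dataclass
class PnrRecord:
    passenger_id: str
    ticket_paid_cash: bool     # payment method recorded in the PNR
    booked_days_before: int    # how far in advance the ticket was bought
    group: str                 # protected characteristic, NOT used by the rule;
                               # kept here only to audit the rule's impact

def flag_for_manual_review(r: PnrRecord) -> bool:
    """A facially neutral rule: flag late, cash-paid bookings."""
    return r.ticket_paid_cash and r.booked_days_before <= 3

# Toy population in which payment habits happen to correlate with group
# membership (e.g. one community is less likely to hold credit cards).
records = [
    PnrRecord("p1", True, 2, "group_a"),
    PnrRecord("p2", True, 1, "group_a"),
    PnrRecord("p3", False, 2, "group_a"),
    PnrRecord("p4", False, 30, "group_b"),
    PnrRecord("p5", False, 2, "group_b"),
    PnrRecord("p6", True, 40, "group_b"),
]

# Audit: compare flag rates per group even though 'group' never enters the rule.
flags, totals = Counter(), Counter()
for r in records:
    totals[r.group] += 1
    flags[r.group] += flag_for_manual_review(r)

for group in totals:
    rate = flags[group] / totals[group]
    print(f"{group}: {flags[group]}/{totals[group]} flagged ({rate:.0%})")
# group_a is flagged far more often: the neutral payment/booking features act
# as proxies for group membership, which is the indirect-discrimination risk
# the article argues the CJEU assessed insufficiently.
```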
Source journal
Frontiers in Political Science (Social Sciences: Political Science and International Relations)
CiteScore: 2.90
Self-citation rate: 0.00%
Articles published: 135
Review time: 13 weeks