The Accountability of Software Developers for War Crimes Involving Autonomous Weapons

Impact Factor: 0.2 · CAS Tier 4 (Sociology) · JCR Q4 (Law) · University of Pittsburgh Law Review, Vol. 83, p. 52 · Publication Date: 2021-10-06 · ISSN: 0041-9915 (print), 1942-8405 (online) · DOI: 10.5195/lawreview.2021.822
E. Winter
{"title":"The Accountability of Software Developers for War Crimes Involving Autonomous Weapons","authors":"E. Winter","doi":"10.5195/lawreview.2021.822","DOIUrl":null,"url":null,"abstract":"This Article considers the extent to which the joint criminal enterprise doctrine could be invoked to hold software developers criminally accountable for violations of international humanitarian law involving autonomous weapons. More specifically, it considers whether the third part of the concept—which concerns common criminal purposes—might be brought to bear to achieve this end. The doctrine is deconstructed into five components, and each component is analyzed both in abstract and in terms of practical application. The Article establishes that, in certain contexts, software developers can and should be held accountable through this mechanism. Thus, it demonstrates that it is possible to avoid the emergence of a “responsibility gap” if, or more likely when, autonomous weapons with offensive capabilities are finally deployed on the battlefield. * The author is a Lecturer (Assistant Professor) in International Law at Newcastle University Law School in the United Kingdom. U N I V E R S I T Y O F P I T T S B U R G H L A W R E V I E W P A G E | 5 2 | V O L . 8 3 | 2 0 2 1 ISSN 0041-9915 (print) 1942-8405 (online) ● DOI 10.5195/lawreview.2021.822 http://lawreview.law.pitt.edu INTRODUCTION The International Committee of the Red Cross (ICRC) defines an autonomous weapon as any weapon system with autonomy in its critical functions that can select and attack targets without human intervention.1 The extent to which the use of autonomous weapons might be compatible with substantive obligations in international humanitarian law (IHL) is a complex issue. The author has written previously on the intersection of these “killer robots” with key humanitarian law principles such as distinction,2 proportionality,3 and precaution.4 The present Article represents something of a departure because instead of considering whether the use of autonomous weapons would comply with the law, it focuses on how international criminal law secures individual accountability for violations of IHL involving such weapons. In other words, it considers potential criminal accountability where, for example, a machine targets a civilian, acts in a disproportionate manner, or fails to issue the appropriate warning. This issue is important because the value of any substantive legal rule is dependent, at least in part, on how amenable that rule is to enforcement. As the United Nations (UN) Special Rapporteur, Christof Heyns, noted: “Without the promise of accountability, deterrence and prevention are reduced, resulting in lower protection of civilians and potential victims of war crimes.”5 Thus, if there are no clear consequences for misusing autonomous weapons, individuals who wish to operate them may see this as a license to deploy machines that are not capable of complying with the law. The effect of this would be the deterioration of real-world protections for civilians. Of course, “robots have no moral agency” and cannot be 1 INT’L COMM. RED CROSS, AUTONOMOUS WEAPON SYS.: IMPLICATIONS OF INCREASING AUTONOMY IN THE CRITICAL FUNCTIONS OF WEAPONS 8 (2016), https://icrcndresourcecentre.org/wp-content/uploads/ 2017/11/4283_002_Autonomus-Weapon-Systems_WEB.pdf. 2 Elliot Winter, The Compatibility of Autonomous Weapons with the Principle of Distinction in the Law of Armed Conflict, 69 INT’L & COMPAR. L.Q. 845 (2020). 
3 Elliot Winter, Autonomous Weapons in Humanitarian Law: Understanding the Technology, Its Compliance with the Principle of Proportionality and the Role of Utilitarianism, 6 GRONINGEN J. INT’L L. 183 (2018). 4 Elliot Winter, The Compatibility of the Use of Autonomous Weapons with the Principle of Precaution in the Law of Armed Conflict, 58 MIL. L. & L. WAR REV. 240 (2020). 5 Christof Heyns (Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions), Rep. on the Extrajudicial, Summary, or Arbitrary Executions, ¶ 75, U.N. Doc. A/HRC/23/47 (Apr. 9, 2013), https:// www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf [hereinafter Heyns]. T H E A C C O U N T A B I L I T Y O F S O F T W A R E D E V E L O P E R S","PeriodicalId":44686,"journal":{"name":"University of Pittsburgh Law Review","volume":" ","pages":""},"PeriodicalIF":0.2000,"publicationDate":"2021-10-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"University of Pittsburgh Law Review","FirstCategoryId":"90","ListUrlMain":"https://doi.org/10.5195/lawreview.2021.822","RegionNum":4,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"LAW","Score":null,"Total":0}
Citations: 1

Abstract

This Article considers the extent to which the joint criminal enterprise doctrine could be invoked to hold software developers criminally accountable for violations of international humanitarian law involving autonomous weapons. More specifically, it considers whether the third part of the concept—which concerns common criminal purposes—might be brought to bear to achieve this end. The doctrine is deconstructed into five components, and each component is analyzed both in abstract and in terms of practical application. The Article establishes that, in certain contexts, software developers can and should be held accountable through this mechanism. Thus, it demonstrates that it is possible to avoid the emergence of a "responsibility gap" if, or more likely when, autonomous weapons with offensive capabilities are finally deployed on the battlefield.

* The author is a Lecturer (Assistant Professor) in International Law at Newcastle University Law School in the United Kingdom.

INTRODUCTION

The International Committee of the Red Cross (ICRC) defines an autonomous weapon as any weapon system with autonomy in its critical functions that can select and attack targets without human intervention.[1] The extent to which the use of autonomous weapons might be compatible with substantive obligations in international humanitarian law (IHL) is a complex issue. The author has written previously on the intersection of these "killer robots" with key humanitarian law principles such as distinction,[2] proportionality,[3] and precaution.[4] The present Article represents something of a departure because instead of considering whether the use of autonomous weapons would comply with the law, it focuses on how international criminal law secures individual accountability for violations of IHL involving such weapons. In other words, it considers potential criminal accountability where, for example, a machine targets a civilian, acts in a disproportionate manner, or fails to issue the appropriate warning. This issue is important because the value of any substantive legal rule is dependent, at least in part, on how amenable that rule is to enforcement. As the United Nations (UN) Special Rapporteur, Christof Heyns, noted: "Without the promise of accountability, deterrence and prevention are reduced, resulting in lower protection of civilians and potential victims of war crimes."[5] Thus, if there are no clear consequences for misusing autonomous weapons, individuals who wish to operate them may see this as a license to deploy machines that are not capable of complying with the law. The effect of this would be the deterioration of real-world protections for civilians. Of course, "robots have no moral agency" and cannot be …

Footnotes:

[1] Int'l Comm. of the Red Cross, Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons 8 (2016), https://icrcndresourcecentre.org/wp-content/uploads/2017/11/4283_002_Autonomus-Weapon-Systems_WEB.pdf.
[2] Elliot Winter, The Compatibility of Autonomous Weapons with the Principle of Distinction in the Law of Armed Conflict, 69 Int'l & Compar. L.Q. 845 (2020).
[3] Elliot Winter, Autonomous Weapons in Humanitarian Law: Understanding the Technology, Its Compliance with the Principle of Proportionality and the Role of Utilitarianism, 6 Groningen J. Int'l L. 183 (2018).
[4] Elliot Winter, The Compatibility of the Use of Autonomous Weapons with the Principle of Precaution in the Law of Armed Conflict, 58 Mil. L. & L. War Rev. 240 (2020).
[5] Christof Heyns (Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions), Rep. on Extrajudicial, Summary or Arbitrary Executions, ¶ 75, U.N. Doc. A/HRC/23/47 (Apr. 9, 2013), https://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf [hereinafter Heyns].
Source journal: University of Pittsburgh Law Review
CiteScore: 0.30
Self-citation rate: 0.00%
Articles per year: 20

About the journal: The Law Review is a student-run journal of legal scholarship that publishes quarterly. Our goal is to contribute to the legal community by featuring pertinent articles that highlight current legal issues and changes in the law. The Law Review publishes articles, comments, book reviews, and notes on a wide variety of topics, including constitutional law, securities regulation, criminal procedure, family law, international law, and jurisprudence. The Law Review has also hosted several symposia, bringing scholars into one setting for lively debate and discussion of key legal topics.
Latest articles in this journal:
- The Ninth Amendment: The "Hard Problem" of U.S. Constitutional Law
- Criminal Justice Technology and the Regulatory Sandbox: Toward Balancing Justice, Accountability, and Innovation
- From Past to Present: Funding the Pennsylvania Public Education System
- The Federal Courts Are Not Bias Free Zones: An Argument for Eliminating Diversity Jurisdiction
- Urgenda vs. Juliana: Lessons for Future Climate Change Litigation Cases