The risks of autonomous weapons: An analysis centred on the rights of persons with disabilities

International Review of the Red Cross · IF 0.6 · CAS Tier 4 (Sociology) · JCR Q2 (LAW) · pp. 278-305 · Pub Date: 2022-11-07 · DOI: 10.1017/S1816383122000881
Mariana Díaz Figueroa, Anderson Henao Orozco, Jesús Martínez, Wanda Muñoz Jaime
Abstract

Autonomous weapons systems have been the subject of heated debate since 2010, when Philip Alston, then Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, brought the issue to the international spotlight in his interim report to the United Nations (UN) General Assembly 65th Session. Alston affirmed that “automated technologies are becoming increasingly sophisticated, and artificial intelligence reasoning and decision-making abilities are actively being researched and receive significant funding. States’ militaries and defence industry developers are working to develop ‘fully autonomous capability’, such that technological advances in artificial intelligence will enable unmanned aerial vehicles to make and execute complex decisions, including the identification of human targets and the ability to kill them.”1 Later, in 2013, Christof Heyns, who was Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions at the time, published a report that elaborated further on the issues raised by what he called “lethal autonomous robotics”.2 Following a recommendation by the Advisory Board on Disarmament Matters at the UN General Assembly 68th Session, the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, as amended on 21 December 2021, started discussing autonomous weapons systems in 2014. Then, the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (GGE on LAWS)3 was created in 2016 to focus on this issue.4 While the group has kept meeting since then, no clear steps have yet been taken towards a normative framework on autonomous weapons as of September 2022.

In all these years, persons with disabilities – including conflict survivors – have not been included in discussions, nor has the disability perspective been reflected in the international debate on autonomous weapons. Only recently has there been any effort to consider the rights of persons with disabilities when examining ethical questions related to artificial intelligence (AI). In this article, we examine how and why autonomous weapons have a disproportionate impact on persons with disabilities, because of the discrimination that results from a combination of factors such as bias in AI, bias in the military and the police, barriers to justice and humanitarian assistance in situations of armed conflict, and the lack of consultation and participation of persons with disabilities and their representative organizations on issues related to autonomy in weapons systems.
Citations: 0

Source journal: International Review of the Red Cross
CiteScore: 1.10 · Self-citation rate: 28.60% · Articles published: 92