Employing Lethal Autonomous Weapon Systems

International Journal of Applied Philosophy (Q3, Arts and Humanities) | Pub Date: 2021-03-27 | DOI: 10.5840/IJAP2021326145
M. Häyry
{"title":"使用致命自主武器系统","authors":"M. Häyry","doi":"10.5840/IJAP2021326145","DOIUrl":null,"url":null,"abstract":"The ethics of warfare and military leadership must pay attention to the rapidly increasing use of artificial intelligence and machines. Who is responsible for the decisions made by a machine? Do machines make decisions? May they make them? These issues are of particular interest in the context of Lethal Autonomous Weapon Systems (LAWS). Are they autonomous or just automated? Do they violate the international humanitarian law which requires that humans must always be responsible for the use of lethal force and for the assessment that civilian casualties are proportionate to the military goals? The article analyses relevant documents, opinions, government positions, and commentaries using the methods of applied ethics. The main conceptual finding is that the definition of autonomy depends on what the one presenting it seeks to support. Those who want to use lethal autonomous weapon systems call them by another name, say, automated instead of autonomous. They impose standards on autonomy that machines do not meet, such as moral agency. Those who wish to ban the use of lethal autonomous weapon systems define them much less broadly and do not require them to do much more than to be a self-standing part of the causal chain.The article’s argument is that the question of responsibility is most naturally perceived by abandoning the most controversial philosophical considerations and simply stating that an individual or a group of people is always responsible for the creation of the equipment they produce and use. This does not mean that those who press the button, or their immediate superiors, are to blame. They are doing their jobs in a system. The ones responsible can probably be found in higher military leadership, in political decision-makers who dictate their goals, and, at least in democracies, in the citizens who have chosen their political decision-makers.","PeriodicalId":35847,"journal":{"name":"International Journal of Applied Philosophy","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2021-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Employing Lethal Autonomous Weapon Systems\",\"authors\":\"M. Häyry\",\"doi\":\"10.5840/IJAP2021326145\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The ethics of warfare and military leadership must pay attention to the rapidly increasing use of artificial intelligence and machines. Who is responsible for the decisions made by a machine? Do machines make decisions? May they make them? These issues are of particular interest in the context of Lethal Autonomous Weapon Systems (LAWS). Are they autonomous or just automated? Do they violate the international humanitarian law which requires that humans must always be responsible for the use of lethal force and for the assessment that civilian casualties are proportionate to the military goals? The article analyses relevant documents, opinions, government positions, and commentaries using the methods of applied ethics. The main conceptual finding is that the definition of autonomy depends on what the one presenting it seeks to support. Those who want to use lethal autonomous weapon systems call them by another name, say, automated instead of autonomous. They impose standards on autonomy that machines do not meet, such as moral agency. 
Those who wish to ban the use of lethal autonomous weapon systems define them much less broadly and do not require them to do much more than to be a self-standing part of the causal chain.The article’s argument is that the question of responsibility is most naturally perceived by abandoning the most controversial philosophical considerations and simply stating that an individual or a group of people is always responsible for the creation of the equipment they produce and use. This does not mean that those who press the button, or their immediate superiors, are to blame. They are doing their jobs in a system. The ones responsible can probably be found in higher military leadership, in political decision-makers who dictate their goals, and, at least in democracies, in the citizens who have chosen their political decision-makers.\",\"PeriodicalId\":35847,\"journal\":{\"name\":\"International Journal of Applied Philosophy\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-03-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Applied Philosophy\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5840/IJAP2021326145\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"Arts and Humanities\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Applied Philosophy","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5840/IJAP2021326145","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Arts and Humanities","Score":null,"Total":0}
Citations: 1

Abstract

The ethics of warfare and military leadership must pay attention to the rapidly increasing use of artificial intelligence and machines. Who is responsible for the decisions made by a machine? Do machines make decisions? May they make them? These issues are of particular interest in the context of Lethal Autonomous Weapon Systems (LAWS). Are they autonomous or merely automated? Do they violate international humanitarian law, which requires that humans must always be responsible for the use of lethal force and for the assessment that civilian casualties are proportionate to the military goals? The article analyses relevant documents, opinions, government positions, and commentaries using the methods of applied ethics. The main conceptual finding is that the definition of autonomy depends on what the one presenting it seeks to support. Those who want to use lethal autonomous weapon systems call them by another name, say, automated instead of autonomous; they impose standards on autonomy that machines do not meet, such as moral agency. Those who wish to ban the use of lethal autonomous weapon systems define autonomy far less demandingly, requiring little more than that the system be a self-standing part of the causal chain.

The article's argument is that the question of responsibility is most naturally approached by abandoning the most controversial philosophical considerations and simply stating that an individual or a group of people is always responsible for the equipment they produce and use. This does not mean that those who press the button, or their immediate superiors, are to blame; they are doing their jobs within a system. The ones responsible can probably be found in higher military leadership, in the political decision-makers who dictate its goals, and, at least in democracies, in the citizens who have chosen those political decision-makers.
Source Journal

International Journal of Applied Philosophy (Arts and Humanities: Philosophy)
CiteScore: 0.30
Self-citation rate: 0.00%
Articles published: 8
Latest Articles in This Journal

The Ethics of Cultivated Meat (in advance)
Proportionality in Self-Defense (in advance)
Public Support of Sectarian Education (in advance)
Trolley Problem Applied (in advance)
Moral Machines (in advance)