{"title":"Autonomous Weapon Systems, Errors and Breaches of International Humanitarian Law","authors":"Abhimanyu George Jain","doi":"10.1093/jicj/mqad043","DOIUrl":null,"url":null,"abstract":"Abstract An error in the operation of an autonomous weapon system (AWS) results in civilians or civilian objects being attacked. In such situations, have civilians or civilian objects been ‘made the object of attack’, such that there is a breach of the rule prohibiting attacks against civilians or civilian objects? This question — which is important because of the high probability of such errors — forms the subject of this article. It argues that the rule prohibiting attacks against civilians or civilian objects requires due diligence — contextually reasonable efforts — across the targeting process, to ensure that civilians or civilian objects are not attacked. This implies that AWS errors breach this rule if the errors are unreasonable, i.e., if they originate in a failure of due diligence at any point in the process of development and deployment of AWS. Moreover, the risk-sensitivity of due diligence obligations suggests that the higher degree of risk involved in the development and use of an AWS leads to a corresponding increase in what constitutes contextually reasonable efforts to ensure that civilians or civilian objects are not attacked.","PeriodicalId":46732,"journal":{"name":"Journal of International Criminal Justice","volume":"17 1","pages":"0"},"PeriodicalIF":1.5000,"publicationDate":"2023-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of International Criminal Justice","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/jicj/mqad043","RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"LAW","Score":null,"Total":0}
Abstract
An error in the operation of an autonomous weapon system (AWS) results in civilians or civilian objects being attacked. In such situations, have civilians or civilian objects been ‘made the object of attack’, such that there is a breach of the rule prohibiting attacks against civilians or civilian objects? This question — which is important because of the high probability of such errors — forms the subject of this article. It argues that the rule prohibiting attacks against civilians or civilian objects requires due diligence — contextually reasonable efforts — across the targeting process to ensure that civilians or civilian objects are not attacked. This implies that AWS errors breach the rule if they are unreasonable, that is, if they originate in a failure of due diligence at any point in the development and deployment of the AWS. Moreover, the risk-sensitivity of due diligence obligations suggests that the higher the degree of risk involved in the development and use of an AWS, the more demanding what counts as contextually reasonable efforts to ensure that civilians or civilian objects are not attacked.
About the journal:
The Journal of International Criminal Justice aims to promote a profound collective reflection on the new problems facing international law. Established by a group of distinguished criminal lawyers and international lawyers, the Journal addresses the major problems of justice from the angle of law, jurisprudence, criminology, penal philosophy, and the history of international judicial institutions. It is intended for graduate and post-graduate students, practitioners, academics and government officials, as well as the hundreds of people working for international criminal courts.