Author: Taylor Kate Woodcock
Journal: Global Society (Q2, International Relations; Impact Factor 1.7)
Publication date: 2023-11-08
Publication type: Journal Article
DOI: 10.1080/13600826.2023.2267592
Human/Machine(-Learning) Interactions, Human Agency and the International Humanitarian Law Proportionality Standard
Developments in machine learning prompt questions about algorithmic decision-support systems (DSS) in warfare. This article explores how the use of these technologies impacts practices of legal reasoning in military targeting. International Humanitarian Law (IHL) requires assessment of the proportionality of attacks, namely whether the expected incidental harm to civilians and civilian objects is excessive compared to the anticipated military advantage. Situating human agency in this practice of legal reasoning, this article considers whether the interaction between commanders (and the teams that support them) and algorithmic DSS for proportionality assessments alters this practice and displaces the exercise of human agency. As DSS that purport to provide recommendations on proportionality generate output in a manner substantively different from proportionality assessments, these systems are not fit for purpose. Moreover, legal reasoning may be shaped by DSS that provide intelligence information, owing to the limited reliability, biases and opacity characteristic of machine learning.
Journal introduction:
Global Society covers the new agenda in global and international relations and encourages innovative approaches to the study of global and international issues from a range of disciplines. It promotes the analysis of transactions at multiple levels, and in particular, the way in which these transactions blur the distinction between the sub-national, national, transnational, international and global levels. An ever-integrating global society raises a number of issues for global and international relations which do not fit comfortably within established paradigms. Among these are the international and global consequences of nationalism and struggles for identity, migration, racism, religious fundamentalism, terrorism and criminal activities.