Toward a Balanced Approach: Bridging the Military, Policy, and Technical Communities
Arun Seraphin, Wilson Miles
Ethics & International Affairs, pp. 272–286. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000321

Abstract: The development of new technologies that enable autonomous weapon systems poses a challenge to policymakers and technologists trying to balance military requirements with international obligations and ethical norms. Some have called for new international agreements to restrict or ban lethal autonomous weapon systems. Given the tactical and strategic value of these technologies and the proliferation of threats, the military continues to explore the development of new autonomous technologies to execute national security missions. The rapid global diffusion and dual-use nature of autonomous systems necessitate a proactive approach and a shared understanding, across these communities, of the technical realities, threats, military relevance, and strategic implications of these technologies. Ultimately, it is possible to develop AI-enabled defense systems that adhere to global norms and relevant treaty obligations, leverage emerging technologies, and provide operational advantages. A workable and realistic regulatory framework governing the use of lethal autonomous weapons, and the artificial intelligence that underpins autonomy, will be best supported through a coordinated effort by the regulatory community, technologists, and the military to create requirements that reflect the global proliferation and rapidly evolving threat of autonomous weapon systems. This essay seeks to demonstrate that (1) the lack of coherent dialogue between the technical and policy communities can create security, ethical, and legal dilemmas; and (2) bridging the military, technical, and policy communities can lead to technology with constraints that balance the needs of all three. It uses case studies to show why mechanisms are needed to enable early and continuous engagement across the technical, policymaking, and operational communities. The essay then draws on twelve interviews with AI and autonomy experts, which provide insight into what the technical and policymaking communities consider fundamental to the progression of responsible autonomous development. It also recommends practical steps for connecting the relevant stakeholders. The goal is to provide the Department of Defense with concrete steps for building organizational structures and processes that create incentives for engagement across communities.
How to End a War: Essays on Justice, Peace, and Repair, Graham Parsons and Mark A. Wilson, eds. (Cambridge, U.K.: Cambridge University Press, 2023), 207 pp., cloth $110, eBook $110.
Lonneke Peperkamp
Ethics & International Affairs, pp. 362–364. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000369
Banning Autonomous Weapons: A Legal and Ethical Mandate
Mary Ellen O'Connell
Ethics & International Affairs, pp. 287–298. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000357

Abstract: ChatGPT launched in November 2022, triggering a global debate on the use of artificial intelligence (AI). A debate on AI-enabled lethal autonomous weapon systems (LAWS) has been underway far longer. Two sides have emerged: one in favor of and one opposed to an international law ban on LAWS. This essay explains the position of advocates of a ban without attempting to persuade opponents. Supporters of a ban believe LAWS are already unlawful and immoral to use, without the need for a new treaty or protocol. They nevertheless seek an express prohibition to educate and publicize the threats these weapons pose. Foremost among their concerns is the "black box" problem. Programmers cannot know what a computer operating a weapons system empowered with AI will "learn" from the algorithm they use. They cannot know at the time of deployment whether the system will comply with the prohibition on the use of force or the human right to life that applies in both war and peace. Even if they could, mechanized killing affronts human dignity. Ban supporters have long known that "AI models are not safe and no one knows how to reliably make them safe" or morally acceptable in taking human life.
Accountability for the Taking of Human Life with LAWS in War
Esther D. Reed
Ethics & International Affairs, pp. 299–308. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000308

Abstract: Accountability for developing, deploying, and using any emerging weapons system is affirmed as a guiding principle by the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems. Yet advances in emerging technologies present accountability challenges throughout the life cycle of a weapons system. Mindful of a lack of progress at the Convention on Certain Conventional Weapons since 2019, this essay argues for a mechanism capable of imputing accountability when individual agent accountability is exceeded, forensic accountability is unreliable, and aspects of political accountability fail.
Hope, Pessimism, and the Shape of a Just Climate Future
Dominic Lenzi
Ethics & International Affairs, pp. 344–361. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000254

Abstract: The urgency of climate change has never been greater, nor the moral case for responding to it more compelling. This review essay critically compares Darrel Moellendorf's Mobilizing Hope and Catriona McKinnon's Climate Change and Political Theory. Moellendorf's book defends the moral importance of poverty alleviation through sustainable economic growth and argues for a mass climate movement based on the promise of a more prosperous future. By contrast, McKinnon provides a political vocabulary to articulate the many faces of climate injustice, and to critically examine proposed policy solutions—notably including the indefinite pursuit of economic growth. While both find reasons to be hopeful, their wide-ranging accounts reflect different visions of what a just and sustainable future might look like: different understandings of sustainable development and the significance of environmental values; the scope of permissible climate activism; and the ethics of geoengineering. Building upon them, I argue in favor of a more pluralistic vision of a just climate future, one that is capable of speaking to the range of moral interests bearing upon the climate and biodiversity crises, and that supports sustainable development that is inclusive of diverse human-nature relationships.
Regulating Weapons: An Aristotelian Account
Anthony F. Lang
Ethics & International Affairs, pp. 309–320. Pub Date: 2023-12-01. DOI: 10.1017/S089267942300031X

Abstract: Regulating war has long been a concern of the international community. From the Hague Conventions to the Geneva Conventions and the multiple treaties and related institutions that have emerged in the twentieth and twenty-first centuries, efforts to mitigate the horrors of war have focused on regulating weapons, defining combatants, and ensuring access to the battlefield for humanitarians. But regulation and legal codes alone cannot be the end point of an engaged ethical response to new weapons developments. This short essay reviews some of the existing ethical works on lethal autonomous weapon systems (LAWS), highlighting how rule- and consequence-based accounts fail to provide adequate guidance for how to deal with them. I propose a virtue-based account, which I link up with an Aristotelian framework, for how the international community might better address these weapons systems.
The Hegemon's Tool Kit: US Leadership and the Politics of the Nuclear Nonproliferation Regime, Rebecca Davis Gibbons (Ithaca, N.Y.: Cornell University Press, 2022), 240 pp., cloth $49.95, eBook $32.99.
Lauren Sukin
Ethics & International Affairs, pp. 364–366. Pub Date: 2023-12-01. DOI: 10.1017/s089267942300028x
Backfire: How Sanctions Reshape the World Against U.S. Interests, Agathe Demarais (New York: Columbia University Press, 2022), 304 pp., cloth $30, eBook $29.99.
Timothy M. Peterson
Ethics & International Affairs, pp. 366–369. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000278
Crimes of Dispassion: Autonomous Weapons and the Moral Challenge of Systematic Killing
Neil Renic, Elke Schwarz
Ethics & International Affairs, pp. 321–343. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000291

Abstract: Systematic killing has long been associated with some of the darkest episodes in human history. Increasingly, however, it is framed as a desirable outcome in war, particularly in the context of military AI and lethal autonomy. Autonomous weapons systems, defenders argue, will surpass humans not only militarily but also morally, enabling a more precise and dispassionate mode of violence, free of the emotion and uncertainty that too often weaken compliance with the rules and standards of war. We contest this framing. Drawing on the history of systematic killing, we argue that lethal autonomous weapons systems reproduce, and in some cases intensify, the moral challenges of the past. Autonomous violence incentivizes a moral devaluation of those targeted and erodes the moral agency of those who kill. Both outcomes imperil essential restraints on the use of military force.
An Operational Perspective on the Ethics of the Use of Autonomous Weapons
David A. Deptula
Ethics & International Affairs, pp. 261–271. Pub Date: 2023-12-01. DOI: 10.1017/S0892679423000266

Abstract: Rapid technological change is resulting in the development of ever more capable autonomous weapon systems. As they become more sophisticated, the calls for developing restrictions on their use, up to and including their complete prohibition, are growing. Not unlike the calls for restrictions on the sale and use of drones, most proposed restrictions are well intentioned but often ill informed, with a high likelihood of degrading national security and putting additional lives at risk. Employed by experienced operators well versed in the laws of armed conflict, autonomous weapons can advance the very objectives held by those who would prohibit their use. This essay takes an operational perspective to examine the role that autonomous weapon systems can play while complying with the laws of armed conflict. With responsible design and the incorporation of applicable control measures, autonomous weapons will be able not just to comply with but also to enhance the ethical use of force. This essay contends that efforts by the international community to use international legal means and institutions to overregulate or even ban lethal autonomous weapons are counterproductive. It considers and describes the end-game results of the use of autonomous weapons in enhancing the application of both international law and human ethical values.