{"title":"The Ethics of 'Deathbots'.","authors":"Nora Freya Lindemann","doi":"10.1007/s11948-022-00417-x","DOIUrl":null,"url":null,"abstract":"<p><p>Recent developments in AI programming allow for new applications: individualized chatbots which mimic the speaking and writing behaviour of one specific living or dead person. 'Deathbots', chatbots of the dead, have already been implemented and are currently under development by the first start-up companies. Thus, it is an urgent issue to consider the ethical implications of deathbots. While previous ethical theories of deathbots have always been based on considerations of the dignity of the deceased, I propose to shift the focus on the dignity and autonomy of the bereaved users of deathbots. Drawing on theories of internet-scaffolded affectivity and on theories of grief, I argue that deathbots may have a negative impact on the grief process of bereaved users and therefore have the potential to limit the emotional and psychological wellbeing of their users. Deathbot users are likely to become dependent on their bots which may make them susceptible to surreptitious advertising by deathbot providing companies and may limit their autonomy. At the same time, deathbots may prove to be helpful for people who suffer from prolonged, severe grief processes. I caution against the unrestricted usage of deathbots and suggest that they should be classified as medical devices. This classification would not the least mean that their non-harm, as well as their helpfulness for people suffering from prolonged grief needs to be proven and that their potential for autonomy infringements is reduced.</p>","PeriodicalId":49564,"journal":{"name":"Science and Engineering Ethics","volume":null,"pages":null},"PeriodicalIF":2.7000,"publicationDate":"2022-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9684218/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Science and Engineering Ethics","FirstCategoryId":"98","ListUrlMain":"https://doi.org/10.1007/s11948-022-00417-x","RegionNum":2,"RegionCategory":"哲学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
Recent developments in AI programming allow for new applications: individualized chatbots which mimic the speaking and writing behaviour of one specific living or dead person. 'Deathbots', chatbots of the dead, have already been implemented and are currently under development by the first start-up companies. It is therefore urgent to consider the ethical implications of deathbots. While previous ethical accounts of deathbots have been based on considerations of the dignity of the deceased, I propose to shift the focus to the dignity and autonomy of the bereaved users of deathbots. Drawing on theories of internet-scaffolded affectivity and on theories of grief, I argue that deathbots may have a negative impact on the grief process of bereaved users and therefore have the potential to limit the emotional and psychological wellbeing of their users. Deathbot users are likely to become dependent on their bots, which may make them susceptible to surreptitious advertising by deathbot-providing companies and may limit their autonomy. At the same time, deathbots may prove helpful for people who suffer from prolonged, severe grief processes. I caution against the unrestricted use of deathbots and suggest that they should be classified as medical devices. This classification would mean, not least, that both their non-harm and their helpfulness for people suffering from prolonged grief would need to be proven, and that their potential for autonomy infringements would be reduced.
Journal introduction:
Science and Engineering Ethics is an international multidisciplinary journal dedicated to exploring ethical issues associated with science and engineering, covering professional education, research and practice as well as the effects of technological innovations and research findings on society.
While the focus of this journal is on science and engineering, contributions from a broad range of disciplines, including social sciences and humanities, are welcomed. Areas of interest include, but are not limited to, ethics of new and emerging technologies, research ethics, computer ethics, energy ethics, animals and human subjects ethics, ethics education in science and engineering, ethics in design, biomedical ethics, values in technology and innovation.
We welcome contributions that deal with these issues from an international perspective, particularly from countries that are underrepresented in these discussions.