How Robots Have Politics
R. Sparrow
The Oxford Handbook of Digital Ethics
Pub Date: 2021-11-10 · DOI: 10.1093/oxfordhb/9780198857815.013.16

In an influential essay published in 1980, philosopher of technology Langdon Winner asked, ‘Do artefacts have politics?’ His answer, confirmed by subsequent decades of science and technology studies, was a resounding ‘Yes!’ Artefacts have political choices embedded in their design and entrench those politics in their applications. Moreover, because technologies are better suited to serving some ends than others, artefacts shape the societies in which they are developed by shaping the circumstances of their own use. This chapter explores how robots have politics and how those politics are relevant to their ethics. It suggests that, for a number of reasons, robots have more politics than other sorts of artefacts do.

The Ethics of Sex Robots
A. Sterri, B. Earp
The Oxford Handbook of Digital Ethics
Pub Date: 2021-11-10 · DOI: 10.1093/oxfordhb/9780198857815.013.13

What, if anything, is wrong with having sex with a robot? For the purposes of this chapter, we assume that sexbots are ‘mere’ machines, reliably identifiable as such despite their human-like appearance and behaviour. Under these stipulations, sexbots themselves can no more be harmed, morally speaking, than your dishwasher. However, there may still be something wrong with the production, distribution, and use of such sexbots. We examine whether sex with robots is intrinsically or instrumentally wrong, critically assess different regulatory responses, and defend a harm reduction approach to sexbot regulation, analogous to the approach that has been considered in other areas, concerning, for example, drugs and sex work.

Price Discrimination in the Digital Age
K. Lippert‐Rasmussen, Lauritz Aastrup Munch
The Oxford Handbook of Digital Ethics
Pub Date: 2021-11-10 · DOI: 10.1093/oxfordhb/9780198857815.013.24

This chapter discusses the morality of online price discrimination. Price discrimination is a widespread type of market behaviour; roughly, it occurs when a seller systematically charges different groups of customers different prices for the same product. Price discrimination occurs both online and offline, but some find the practice particularly suspicious when deployed in online markets. This asymmetry, we argue, calls for an explanation. In the chapter, we define price discrimination and review a number of explanations of why, and when, it is morally objectionable. We argue that online price discrimination will often prove more problematic than its offline counterpart, but also that neither practice is necessarily morally wrong.

Is There Collective Responsibility for Misogyny Perpetrated on Social Media?
H. Lawford-Smith, J. Megarry
The Oxford Handbook of Digital Ethics
Pub Date: 2021-11-10 · DOI: 10.1093/oxfordhb/9780198857815.013.9

Women, particularly those who work in public positions (such as journalists, politicians, celebrities, and activists), are subject to disproportionate amounts of abuse on social media platforms such as Twitter. This abuse occurs in a landscape that those platforms designed and that they maintain. Focusing in particular on Twitter, as typical of the kind of platform we are interested in, this chapter argues that it is the platform, not (usually) the individuals who use it, that bears collective responsibility, as a corporate agent, for misogyny. Social media platforms, however, should not let the prevention of misogyny overstep into interference in open political debates.

Algorithmic Bias and Access to Opportunities
Lisa Herzog
The Oxford Handbook of Digital Ethics
Pub Date: 2021-11-10 · DOI: 10.1093/oxfordhb/9780198857815.013.21

The chapter discusses the problem of algorithmic bias in decision-making processes that determine access to opportunities, such as recidivism scores, college admission decisions, or loan scores. After describing the technical bases of algorithmic bias, it asks how to evaluate them, drawing on Iris Marion Young’s perspective of structural (in)justice. The focus is in particular on the risk of so-called ‘Matthew effects’, in which privileged individuals gain further advantages while those who are already disadvantaged suffer further losses. Some proposed solutions are discussed, with an emphasis on the need to take a broad, interdisciplinary perspective rather than a purely technical one. The chapter also replies to the objection that private firms cannot be held responsible for addressing structural injustices, and it concludes by emphasizing the need for political and social action.