Investigating the Legality of Bias Mitigation Methods in the United Kingdom
Mackenzie Jorgensen; Madeleine Waller; Oana Cocarascu; Natalia Criado; Odinaldo Rodrigues; Jose Such; Elizabeth Black
IEEE Technology and Society Magazine, vol. 42, no. 4, pp. 87-94, December 2023. DOI: 10.1109/MTS.2023.3341465
https://ieeexplore.ieee.org/document/10410096/
Cited by: 0
Abstract
Fairness issues in Algorithmic Decision-Making Systems (ADMS)1 have been well highlighted over the past decade [1], including some facial recognition systems struggling to identify people of color [2]. In 2021, Uber drivers filed a claim with the U.K.'s employment tribunal for unfair dismissal resulting from Microsoft's automated facial recognition technology [3]. Bias mitigation methods have been developed to reduce discrimination from ADMS. These typically operationalize fairness notions as fairness metrics to minimize discrimination [4]. We refer to ADMS to which bias mitigation methods have been applied as "mitigated ADMS" or, in the singular, a "mitigated system."
Journal introduction:
IEEE Technology and Society Magazine invites feature articles (refereed), special articles, and commentaries on topics within the scope of the IEEE Society on Social Implications of Technology, in the broad areas of social implications of electrotechnology, history of electrotechnology, and engineering ethics.