The Role of Biased Data in Computerized Gender Discrimination

IF 2.0 · Q3 (MANAGEMENT) · Equality Diversity and Inclusion · Pub Date: 2022-05-01 · DOI: 10.1145/3524501.3527599
Md. Arshad Ahmed, Madhur Chatterjee, Pankaj Dadure, Partha Pakray
{"title":"有偏见的数据在计算机性别歧视中的作用","authors":"Md. Arshad Ahmed, Madhur Chatterjee, Pankaj Dadure, Partha Pakray","doi":"10.1145/3524501.3527599","DOIUrl":null,"url":null,"abstract":"Gender bias is prevalent in all walks of life from schools to colleges, corporate as well as government offices. This has led to the under-representation of the female gender in many professions. Most of the Artificial Intelligence-Natural Language Processing (AI-NLP) models learning from these underrepresented real world datasets amplify the bias in many cases, resulting in traditional biases being reinforced. In this paper, we have discussed how gender bias became ingrained in our society and how it results in the underrepresentation of the female gender in several fields such as education, healthcare, STEM, film industry, food industry, and sports. We shed some light on how traditional gender bias is reflected in AI-NLP systems such as automated resume screening, machine translation, text generation, etc. Future prospects of these AI-NLP applications need to include possible solutions to these existing biased AI-NLP applications, such as debiasing the word embeddings and having guidelines for more ethical and transparent standards. ACM Reference Format: Md. Arshad Ahmed, Madhura Chatterjee, Pankaj Dadure, and Partha Pakray. 2022. The Role of Biased Data in Computerized Gender Discrimination. In Third Workshop on Gender Equality, Diversity, and Inclusion in Software Engineering (GE@ICSE’22), May 20, 2022, Pittsburgh, PA, USA. ACM, New York, NY, USA, 6 pages. https://doi.org/10.1145/3524501.3527599","PeriodicalId":46962,"journal":{"name":"Equality Diversity and Inclusion","volume":"129 1","pages":"6-11"},"PeriodicalIF":2.0000,"publicationDate":"2022-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"The Role of Biased Data in Computerized Gender Discrimination\",\"authors\":\"Md. Arshad Ahmed, Madhur Chatterjee, Pankaj Dadure, Partha Pakray\",\"doi\":\"10.1145/3524501.3527599\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Gender bias is prevalent in all walks of life from schools to colleges, corporate as well as government offices. This has led to the under-representation of the female gender in many professions. Most of the Artificial Intelligence-Natural Language Processing (AI-NLP) models learning from these underrepresented real world datasets amplify the bias in many cases, resulting in traditional biases being reinforced. In this paper, we have discussed how gender bias became ingrained in our society and how it results in the underrepresentation of the female gender in several fields such as education, healthcare, STEM, film industry, food industry, and sports. We shed some light on how traditional gender bias is reflected in AI-NLP systems such as automated resume screening, machine translation, text generation, etc. Future prospects of these AI-NLP applications need to include possible solutions to these existing biased AI-NLP applications, such as debiasing the word embeddings and having guidelines for more ethical and transparent standards. ACM Reference Format: Md. Arshad Ahmed, Madhura Chatterjee, Pankaj Dadure, and Partha Pakray. 2022. The Role of Biased Data in Computerized Gender Discrimination. In Third Workshop on Gender Equality, Diversity, and Inclusion in Software Engineering (GE@ICSE’22), May 20, 2022, Pittsburgh, PA, USA. ACM, New York, NY, USA, 6 pages. 
https://doi.org/10.1145/3524501.3527599\",\"PeriodicalId\":46962,\"journal\":{\"name\":\"Equality Diversity and Inclusion\",\"volume\":\"129 1\",\"pages\":\"6-11\"},\"PeriodicalIF\":2.0000,\"publicationDate\":\"2022-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Equality Diversity and Inclusion\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3524501.3527599\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"MANAGEMENT\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Equality Diversity and Inclusion","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3524501.3527599","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MANAGEMENT","Score":null,"Total":0}
Citations: 5

Abstract

Gender bias is prevalent in all walks of life, from schools and colleges to corporate and government offices, and has led to the under-representation of women in many professions. Most Artificial Intelligence-Natural Language Processing (AI-NLP) models that learn from these unrepresentative real-world datasets amplify the bias in many cases, reinforcing traditional prejudices. In this paper, we discuss how gender bias became ingrained in our society and how it results in the underrepresentation of women in fields such as education, healthcare, STEM, the film industry, the food industry, and sports. We shed some light on how traditional gender bias is reflected in AI-NLP systems such as automated resume screening, machine translation, and text generation. Future work on these AI-NLP applications needs to include possible solutions to their existing biases, such as debiasing the word embeddings and establishing guidelines for more ethical and transparent standards.

ACM Reference Format: Md. Arshad Ahmed, Madhura Chatterjee, Pankaj Dadure, and Partha Pakray. 2022. The Role of Biased Data in Computerized Gender Discrimination. In Third Workshop on Gender Equality, Diversity, and Inclusion in Software Engineering (GE@ICSE'22), May 20, 2022, Pittsburgh, PA, USA. ACM, New York, NY, USA, 6 pages. https://doi.org/10.1145/3524501.3527599
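The remedy the abstract mentions, "debiasing the word embeddings," commonly refers to projection-based methods such as the hard-debiasing approach of Bolukbasi et al. (2016), which removes the component of each word vector that lies along an estimated gender direction. The following is a minimal Python sketch of that idea; the vocabulary, vector dimensions, and numeric values are hypothetical toy placeholders, not a real embedding model, and the paper itself may use a different technique.

    import numpy as np

    # Minimal sketch of projection-based ("hard") debiasing:
    # estimate a gender direction from gendered word pairs, then
    # subtract that component from other word vectors.
    # All vectors below are hypothetical toy values.

    def gender_direction(embeddings, pairs):
        """Average the normalized differences of gendered pairs
        (e.g. 'he'/'she') to estimate a gender direction."""
        diffs = [embeddings[a] - embeddings[b] for a, b in pairs]
        direction = np.mean(diffs, axis=0)
        return direction / np.linalg.norm(direction)

    def debias(vector, direction):
        """Project out the gender component from a word vector."""
        return vector - np.dot(vector, direction) * direction

    # Toy 4-dimensional embeddings (hypothetical values).
    embeddings = {
        "he":    np.array([0.8, 0.1, 0.3, 0.0]),
        "she":   np.array([-0.8, 0.1, 0.3, 0.0]),
        "nurse": np.array([-0.5, 0.6, 0.2, 0.1]),
    }

    d = gender_direction(embeddings, [("he", "she")])
    debiased_nurse = debias(embeddings["nurse"], d)
    print(np.dot(debiased_nurse, d))  # ~0.0: gender component removed

After debiasing, a profession word such as "nurse" is equidistant from the gendered pair along the estimated direction, which is the property such methods aim for; later work has noted that this removes only one linear component of bias, so it is a mitigation rather than a complete fix.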
Source Journal
CiteScore: 4.50
Self-citation rate: 8.30%
Publication volume: 50
Latest Articles in This Journal
- The Social Drivers of Inclusive Workplaces scale: a preliminary validation of the questionnaire
- Hope theory as resistance: narratives of South Asian scholars in Australian academia
- Coping techniques and strategies for pursuing anti-racism within academe: a collective autoethnographic account from minoritised academics in the UK
- Covering Número 85: a content analysis and critical race theory perspective
- Addressing the challenge of engaging in paid work while undertaking unpaid caring: insights for improving employment inclusion of young carers