Text Mining for Bias: A Recommendation Letter Experiment

IF 1.3 · JCR Q3 (Business) · CAS Region 3 (Sociology) · American Business Law Journal · Pub Date: 2022-04-06 · DOI: 10.1111/ablj.12198
Charlotte S. Alexander
Citations: 1

Abstract


This article uses computational text analysis to study the form and content of more than 3000 recommendation letters submitted on behalf of applicants to a major U.S. anesthesiology residency program. The article finds small differences in form and larger differences in content. Women applicants' letters were more likely to contain references to acts of service, for example, whereas men were more likely to be described in terms of their professionalism and technical skills. Some differences persisted when controlling for standardized aptitude test scores, on which women and men scored equally on average, and other applicant and letter-writer characteristics. Even when all explicit gender-identifying language was stripped from the letters, a machine learning algorithm was able to predict applicant gender at a rate better than chance. Gender stereotyped language in recommendation letters may infect the entirety of an employer's hiring or selection process, implicating Title VII of the Civil Rights Act of 1964. Not all gendered language differences were large, however, suggesting that small changes may remedy the problem. The article closes by proposing a computationally driven system that may help employers identify and eradicate bias, while also prompting a rethinking of our gendered, racialized, ableist, ageist, and otherwise stereotyped occupational archetypes.
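The abstract's machine-learning finding — that applicant gender remained predictable even after explicit gender-identifying language was removed — rests on a standard text-classification setup. The sketch below is purely illustrative: it uses a toy Naive Bayes classifier over invented letter snippets, not the article's actual model, features, or data, and the vocabulary cues (service words versus skill words) are hypothetical stand-ins for the patterns the article reports.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train_nb(docs):
    """Count words per label; docs is a list of (text, label) pairs."""
    counts, totals = {}, Counter()
    for text, label in docs:
        c = counts.setdefault(label, Counter())
        for w in tokenize(text):
            c[w] += 1
        totals[label] += 1
    vocab = {w for c in counts.values() for w in c}
    return counts, totals, vocab

def predict(text, counts, totals, vocab):
    """Pick the label with the highest Laplace-smoothed log-probability."""
    n = sum(totals.values())
    best, best_lp = None, float("-inf")
    for label, c in counts.items():
        lp = math.log(totals[label] / n)          # class prior
        denom = sum(c.values()) + len(vocab)      # add-one smoothing
        for w in tokenize(text):
            lp += math.log((c[w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Invented snippets echoing the article's reported pattern:
# service language in women's letters, skill language in men's.
letters = [
    ("she volunteered tirelessly and served on every committee", "F"),
    ("her compassion and service to patients stood out", "F"),
    ("his technical skill and professionalism were exceptional", "M"),
    ("a consummate professional with superb procedural technique", "M"),
]
counts, totals, vocab = train_nb(letters)
print(predict("known for service and committee work", counts, totals, vocab))  # → F
```

Even though none of the classifier's cue words name a gender directly, the word distributions differ enough by label for prediction to beat chance — the same mechanism the article exploits after stripping explicit gender markers.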

Source journal: American Business Law Journal
CiteScore: 1.10
Self-citation rate: 16.70%
Articles per year: 17
About the journal: The ABLJ is a faculty-edited, double-blind peer-reviewed journal, continuously published since 1963. Our mission is to publish only top-quality law review articles that make a scholarly contribution to all areas of law that impact business theory and practice. We search for those articles that articulate a novel research question and make a meaningful contribution directly relevant to scholars and practitioners of business law. The blind peer review process means legal scholars well-versed in the relevant specialty area have determined selected articles are original, thorough, important, and timely. Faculty editors assure the authors' contribution to scholarship is evident. We aim to elevate legal scholarship and inform responsible business decisions.