Investigating Algorithmic Risk and Race

Melissa Hamilton
{"title":"调查算法风险和种族","authors":"Melissa Hamilton","doi":"10.5070/cj85154807","DOIUrl":null,"url":null,"abstract":"Author(s): Hamilton, Melissa | Abstract: Risk assessment algorithms lie at the heart of criminal justice reform to tackle mass incarceration. The newest application of risk tools centers on the pretrial stage as a means to reduce both reliance upon wealth-based bail systems and rates of pretrial detention. Yet the ability of risk assessment to achieve the reform movement’s goals will be challengedif the risk tools do not perform equitably for minorities. To date, little is known about the racial fairness of these algorithms as they are used in the field. This Article offers an original empirical study of a popular risk assessment tool to evaluate its race-based performance. The case study is novel in employing a two-sample design with large datasets from diverse jurisdictions, one with a supermajority white population and the other a supermajority Black population.Statistical analyses examine whether, in these jurisdictions, the algorithmic risk tool results in disparate impact, exhibits test bias, or displays differential validity in terms of unequal performance metrics for white versus Black defendants. Implications of the study results are informative to the broader knowledge base about risk assessment practices in the field. Results contribute to the debate about the topic of algorithmic fairness in an important setting where one’s liberty interests may be infringed despite not being adjudicated guilty of any crime.","PeriodicalId":91042,"journal":{"name":"UCLA criminal justice law review","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Investigating Algorithmic Risk and Race\",\"authors\":\"Melissa Hamilton\",\"doi\":\"10.5070/cj85154807\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Author(s): Hamilton, Melissa | Abstract: Risk assessment algorithms lie at the heart of criminal justice reform to tackle mass incarceration. The newest application of risk tools centers on the pretrial stage as a means to reduce both reliance upon wealth-based bail systems and rates of pretrial detention. Yet the ability of risk assessment to achieve the reform movement’s goals will be challengedif the risk tools do not perform equitably for minorities. To date, little is known about the racial fairness of these algorithms as they are used in the field. This Article offers an original empirical study of a popular risk assessment tool to evaluate its race-based performance. The case study is novel in employing a two-sample design with large datasets from diverse jurisdictions, one with a supermajority white population and the other a supermajority Black population.Statistical analyses examine whether, in these jurisdictions, the algorithmic risk tool results in disparate impact, exhibits test bias, or displays differential validity in terms of unequal performance metrics for white versus Black defendants. Implications of the study results are informative to the broader knowledge base about risk assessment practices in the field. 
Results contribute to the debate about the topic of algorithmic fairness in an important setting where one’s liberty interests may be infringed despite not being adjudicated guilty of any crime.\",\"PeriodicalId\":91042,\"journal\":{\"name\":\"UCLA criminal justice law review\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"UCLA criminal justice law review\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5070/cj85154807\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"UCLA criminal justice law review","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5070/cj85154807","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Risk assessment algorithms lie at the heart of criminal justice reform to tackle mass incarceration. The newest application of risk tools centers on the pretrial stage as a means to reduce both reliance upon wealth-based bail systems and rates of pretrial detention. Yet the ability of risk assessment to achieve the reform movement's goals will be challenged if the risk tools do not perform equitably for minorities. To date, little is known about the racial fairness of these algorithms as they are used in the field. This Article offers an original empirical study of a popular risk assessment tool to evaluate its race-based performance. The case study is novel in employing a two-sample design with large datasets from diverse jurisdictions, one with a supermajority white population and the other a supermajority Black population. Statistical analyses examine whether, in these jurisdictions, the algorithmic risk tool results in disparate impact, exhibits test bias, or displays differential validity in terms of unequal performance metrics for white versus Black defendants. Implications of the study results are informative to the broader knowledge base about risk assessment practices in the field. Results contribute to the debate about algorithmic fairness in an important setting where one's liberty interests may be infringed despite not being adjudicated guilty of any crime.
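
The analysis described above turns on race-based performance comparisons: disparate impact, test bias, and differential validity expressed as unequal performance metrics for white versus Black defendants. As a rough, hypothetical sketch of what such a group-wise comparison can look like (it is not the Article's actual tool, data, or analysis), the Python snippet below computes flag rates, false positive and false negative rates, and AUC separately for two groups of synthetic defendants; the risk-score range, cutoff, and outcome model are assumptions chosen only for illustration.

```python
# Hypothetical illustration only: comparing a risk tool's group-wise performance
# metrics to probe disparate impact and differential validity. Synthetic data;
# not the Article's instrument, jurisdictions, or results.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(seed=0)

# Synthetic defendants: risk scores 0-10, a "high risk" flag at an assumed
# cutoff of 6, and observed pretrial failure outcomes, split by group label.
n = 5000
group = rng.choice(["white", "Black"], size=n)
risk_score = rng.integers(0, 11, size=n)
outcome = rng.binomial(1, p=0.15 + 0.03 * risk_score)  # toy outcome model
flagged = (risk_score >= 6).astype(int)

def group_metrics(mask):
    """Flag rate, error rates, and AUC computed within one group."""
    y, yhat, s = outcome[mask], flagged[mask], risk_score[mask]
    tn, fp, fn, tp = confusion_matrix(y, yhat).ravel()
    return {
        "flag_rate": yhat.mean(),        # share flagged high risk (disparate impact)
        "fpr": fp / (fp + tn),           # flagged but did not fail
        "fnr": fn / (fn + tp),           # not flagged but failed
        "auc": roc_auc_score(y, s),      # discrimination (differential validity)
    }

metrics = {g: group_metrics(group == g) for g in ("white", "Black")}
for g, m in metrics.items():
    print(g, {k: round(v, 3) for k, v in m.items()})

# Ratio of flag rates across groups, a common disparate-impact screen
print("flag-rate ratio:",
      round(metrics["Black"]["flag_rate"] / metrics["white"]["flag_rate"], 3))
```

In the fairness literature such a comparison is typically read as follows: a flag-rate ratio far from parity suggests disparate impact, while error rates or AUC that diverge between groups point toward differential validity. How those readings apply to the tool studied here is the subject of the Article itself.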