Aaron Shapiro
Surveillance & Society, September 7, 2019
DOI: https://doi.org/10.24908/ss.v17i3/4.10410
Predictive Policing for Reform? Indeterminacy and Intervention in Big Data Policing
Predictive analytics and artificial intelligence are applied widely across law enforcement agencies and the criminal justice system. Despite criticism that such tools reinforce inequality and structural discrimination, proponents insist that they will nonetheless improve the equality and fairness of outcomes by countering humans’ biased or capricious decision-making. How can predictive analytics be understood simultaneously as a source of, and solution to, discrimination and bias in criminal justice and law enforcement? The article provides a framework for understanding the techno-political gambit of predictive policing as a mechanism of police reform—a discourse that I call “predictive policing for reform.” Focusing specifically on geospatial predictive policing systems, I argue that “predictive policing for reform” should be seen as a flawed attempt to rationalize police patrols through an algorithmic remediation of patrol geographies. The attempt is flawed because predictive systems operate on the sociotechnical practices of police patrols, which are themselves contradictory enactments of the state’s power to distribute safety and harm. The ambiguities and contradictions of the patrol are not resolved through algorithmic remediation. Instead, they lead to new indeterminacies, trade-offs, and experimentations based on unfalsifiable claims. I detail these through a discussion of predictive policing firm HunchLab’s use of predictive analytics to rationalize patrols and mitigate bias. Understanding how the “predictive policing for reform” discourse is operationalized as a series of technical fixes that rely on the production of indeterminacies allows for a more nuanced critique of predictive policing.