Data trusts are an increasingly popular proposal for managing complex data governance questions, although what they are remains contested. Sidewalk Labs proposed creating an “Urban Data Trust” as part of the Sidewalk Toronto “smart” redevelopment of a portion of Toronto’s waterfront. This part of its proposal was rejected before Sidewalk Labs cancelled the project. This research note briefly places the Urban Data Trust within the general debate regarding data trusts and then discusses one set of reasons for its failure: its incoherence as a model. The Urban Data Trust was a failed model because it lacked clarity regarding the nature of the problem(s) to which it is a solution, how accountability and oversight are secured, and its relation to existing data protection law. These are important lessons for the more general debate regarding data trusts and their role in data governance.
{"title":"Data Trusts and the Governance of Smart Environments: Lessons from the Failure of Sidewalk Labs’ Urban Data Trust","authors":"Lisa M. Austin, D. Lie","doi":"10.24908/ss.v19i2.14409","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14409","url":null,"abstract":"Data trusts are an increasingly popular proposal for managing complex data governance questions, although what they are remains contested. Sidewalk Labs proposed creating an “Urban Data Trust” as part of the Sidewalk Toronto “smart” redevelopment of a portion of Toronto’s waterfront. This part of its proposal was rejected before Sidewalk Labs cancelled the project. This research note briefly places the Urban Data Trust within the general debate regarding data trusts and then discusses one set of reasons for its failure: its incoherence as a model. The Urban Data Trust was a failed model because it lacked clarity regarding the nature of the problem(s) to which it is a solution, how accountability and oversight are secured, and its relation to existing data protection law. These are important lessons for the more general debate regarding data trusts and their role in data governance. \u0000 ","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":" ","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47061833","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Over the last decade, and more recently triggered by the COVID-19 pandemic, algorithmic surveillance technologies have been increasingly implemented and experimented with by police for crime control and public order policing, and as management tools. Police departments are also increasingly consumers of surveillance technologies that are created, sold, and controlled by private companies. These companies exercise undue influence over policing today in ways that are not widely acknowledged, and they increasingly play a role in the data capture and processing that feed into larger cloud infrastructures and data markets. These developments are having profound effects on how policing is organized and on existing power relations, as decisions are increasingly being made by algorithms. Although algorithmic police surveillance receives attention in academic research as well as in mainstream media, critical discussions of its democratic oversight are rare. The goal of this paper is to contribute to ongoing research on police and surveillance oversight and to question how current judicial oversight of algorithmic police surveillance in Belgium addresses the socio-technical harms of these surveillance practices.
{"title":"How to Watch the Watchers? Democratic Oversight of Algorithmic Police Surveillance in Belgium","authors":"R. V. Brakel","doi":"10.24908/ss.v19i2.14325","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14325","url":null,"abstract":"In the last decade and more recently triggered by the COVID-19 pandemic, algorithmic surveillance technologies have been increasingly implemented and experimented with by the police for crime control, public order policing, and as management tools. Police departments are also increasingly consumers of surveillance technologies that are created, sold, and controlled by private companies. They exercise an undue influence over police today in ways that are not widely acknowledged and increasingly play a role in the data capture and processing that feeds into larger cloud infrastructures and data markets. These developments are having profound effects on how policing is organized and on existing power relations, whereby decisions are increasingly being made by algorithms. Although attention is paid to algorithmic police surveillance in academic research as well as in mainstream media, critical discussions about its democratic oversight are rare. The goal of this paper is to contribute to ongoing research on police and surveillance oversight and to question how current judicial oversight of algorithmic police surveillance in Belgium addresses socio-technical harms of these surveillance practices.","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":" ","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48855828","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In early 2016, the city of New York and the Google-backed consortium CityBridge launched LinkNYC, a communication network that enables residents and visitors to access Wi-Fi, charge their phones, and make domestic calls—all for free. The ten-foot-tall kiosks scattered around the city are also equipped with screens, cameras, a tablet, speakers, and a microphone. Almost immediately after its launch, many raised concerns about LinkNYC: noise complaints about users playing loud music, homeless people gathering around the kiosks, outrage over users watching pornography, and the potential threat to privacy the kiosks pose. In this paper, I argue that LinkNYC functions as a neoliberal apparatus of listening and silencing in the public sphere through data collection and restrictions on use of the kiosks in the name of accessibility. As Google’s first attempt at occupying public space, LinkNYC reveals the aspirations for the neoliberal city. Through an ethnographic socio-technological study of LinkNYC, I engage sound studies in current discussions about surveillance. I theorize the modalities of listening in the neoliberal city and discuss competing notions of public space in smart/responsive cities. I investigate the ideological difference between the smart city and the responsive city and trace the movement from a listening entity to a responsive one, analyzing the implications for privacy. I theorize unsilencing and its politics, discussing examples of re-appropriation of the kiosks. I conducted fieldwork by observing interactions with the kiosks and by conducting interviews with citizens, homeless advocacy groups, CityBridge employees, and experts. In addition, I analyze the discourses of CityBridge, local politicians, activists, journalists, and citizens surrounding LinkNYC. This paper sits at the theoretical intersection of sound studies, urban studies, science and technology studies, and surveillance studies. Through this case study, I open a theorization of the listening practices of surveillance to look at how power circulates through sound.
{"title":"The Noise of Silent Machines: A Case Study of LinkNYC","authors":"Audrey Amsellem","doi":"10.24908/ss.v19i2.14302","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14302","url":null,"abstract":"In early 2016, the city of New York and the Google-backed consortium CityBridge launched LinkNYC, a communication network that enables residents and visitors to access Wi-Fi, charge their phones, and make domestic calls—all for free. The ten-feet tall kiosks scattered around the city are also equipped with screens, cameras, a tablet, speakers, and a microphone. Almost immediately after its launch, many raised concerns about LinkNYC: noise complaints concerning users listening to loud music, homeless people gathering around the kiosks, outrage regarding users watching pornography, as well as the potential threat to privacy the kiosks present. In this paper, I argue that LinkNYC functions as a neoliberal apparatus of listening and silencing in the public sphere through data collection and restrictions of usage of the kiosk in the name of accessibility. As Google’s first attempt at occupying the public space, LinkNYC reveals the aspirations for the neoliberal city. Through an ethnographic socio-technological study of LinkNYC, I engage sound studies in current discussions about surveillance. I theorize the modalities of listening in the neoliberal city and discuss competing notions of the public space in smart/responsive cities. I investigate the ideological difference between the smart city and the responsive city and trace the movement from a listening entity to a responsive one, analyzing the implications for privacy. I theorize unsilencing and its politics, discussing examples of re-appropriation of the kiosks. I conducted fieldwork by observing interactions with the kiosks and by doing interviews with citizens, homeless advocacy groups, CityBridge employees, and experts. In addition, I analyze the discourses of CityBridge, local politicians, activists, journalists, and citizens surrounding LinkNYC. This paper is at the theoretical intersection of sound studies, urban studies, science and technology studies, and surveillance studies. Through this case study, I open a theorization of the listening practices of surveillance to look at how power circulates through sound.","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":" ","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46977014","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We are witnessing an upsurge in crime forecasting software, which supposedly draws predictive knowledge from data on past crime. Although prevention and anticipation are already embedded in the apparatuses of government, going beyond a mere abstract aspiration, the latest innovations hold out the promise of replacing police officers’ “gut feelings” and discretionary risk assessments with algorithmically powered, quantified analyses of risk scores. While police departments and private companies praise such innovations for their cost-effective rationale, critics raise concerns about their potential for discriminating against poor, black, and migrant communities. In this article, I address such controversies by telling the story of the making of CrimeRadar, an app developed by a Rio de Janeiro-based think tank in partnership with private associates and local police authorities. Drawing mostly on Latour’s contributions to the emerging literature on security assemblages, I argue that we gain explanatory and critical leverage by looking into the mundane practices of making and unmaking sociotechnical arrangements. That is, I trace the chain of translations through which crime data are collected, organized, and transformed into risk scores. At every step, new ways of seeing and presenting crime are produced, with a significant impact on how we experience and act upon (in)security.
{"title":"The The Making of Crime Predictions: Sociotechnical Assemblages and the Controversies of Governing Future Crime","authors":"Daniel Edler Duarte","doi":"10.24908/ss.v19i2.14261","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14261","url":null,"abstract":"We are witnessing an upsurge in crime forecasting software, which supposedly draws predictive knowledge from data on past crime. Although prevention and anticipation are already embedded in the apparatuses of government, going beyond a mere abstract aspiration, the latest innovations hold out the promise of replacing police officers’ “gut feelings” and discretionary risk assessments with algorithmic-powered, quantified analyses of risk scores. While police departments and private companies praise such innovations for their cost-effective rationale, critics raise concerns regarding their potential for discriminating against poor, black, and migrant communities. In this article, I address such controversies by telling the story of the making of CrimeRadar, an app developed by a Rio de Janeiro-based think tank in partnership with private associates and local police authorities. Drawing mostly on Latour’s contributions to the emerging literature on security assemblages, I argue that we gain explanatory and critical leverage by looking into the mundane practices of making and unmaking sociotechnical arrangements. That is, I address the chain of translations through which crime data are collected, organized, and transformed into risk scores. In every step, new ways of seeing and presenting crime are produced, with a significant impact on how we experience and act upon (in)security.","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":"26 4","pages":"199-215"},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138504074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Review of Pallitto’s Bargaining with the Machine: Technology, Surveillance, and the Social Contract","authors":"Breigha Adeyemo","doi":"10.24908/ss.v19i2.14739","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14739","url":null,"abstract":"","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":" ","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44803018","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This article analyzes the increasing articulation between third-party software and emergency infrastructures through a focus on the computer-assisted dispatch system RapidDeploy, which purports to help 9-1-1 responders respond more accurately and efficiently to emergency situations. We build from research that focuses on overlaps between surveillance and emergency response to demonstrate how large-scale data mining practices—in particular, location extraction—are repurposed as beneficial, if not life-saving, measures. In focusing on the capacity to extract and analyze location in the name of public good, we discursively analyze how RapidDeploy’s official company blog constructs the benevolence of data collection. We then demonstrate how these viewpoints are reproduced in news reports in one city—Charleston, South Carolina—where RapidDeploy has formally partnered with emergency response services. This analysis demonstrates how words like “data” and “cloud” continue to be used as vague buzzwords for companies situated at the intersections of surveillance and civic function, and argues for greater attention to how the trend towards platformization continues to blur the relationships between surveillance and emergency response systems. This case study examines the public face of this company and, as such, analyzes the language used to gain public assent for the software and its function.
{"title":"Emergency Infrastructure and Locational Extraction: Problematizing Computer Assisted Dispatch Systems as Public Good","authors":"James N. Gilmore, McKinley DuRant","doi":"10.24908/ss.v19i2.14116","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14116","url":null,"abstract":"This article analyzes the increasing articulation between third-party software and emergency infrastructures through a focus on the computer-assisted dispatch system RapidDeploy, which purports to help 9-1-1 responders more accurately and efficiently respond to emergency situations. We build from research that focuses on overlaps between surveillance and emergency response to demonstrate how large-scale data mining practices—in particular, location extraction—are repurposed as beneficial, if not life-saving, measures. In focusing on the capacity to extract and analyze location in the name of public good, we discursively analyze how RapidDeploy’s official company blog constructs the benevolence of data collection. We then demonstrate how these viewpoints are reproduced in the news reports in one city—Charleston, South Carolina—where RapidDeploy has formally partnered with emergency response services. This analysis is used to demonstrate how words like “data” and “cloud” continue to be used as vague buzzwords for companies situated at the intersections of surveillance and civic function, and to argue for greater attention to how the trend towards platformization continues to blur the relationships between surveillance and emergency response systems. This case study examines the public face of this company and, as such, analyzes the language used to gain public assent for the software and its function.","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":" ","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44846667","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper discusses the electronic monitoring (EM) of indicted and convicted citizens in Brazil during the COVID-19 pandemic. We start by discussing how EM was implemented in the country and describing its close link with the technology company Spacecom. We argue that the use of EM to mitigate the impact of COVID-19 in the Brazilian prison system intensifies the continuation of an uninterrupted mechanism of social control that is sustained by systemic racism in Brazil through a growing link between the State and technology companies. Mapping the changes that EM imposes on criminal legal practices, reflecting on data access and management carried out by private companies, and analyzing the acceleration of this process during the COVID-19 pandemic in Brazil are topics addressed herein.
{"title":"Smart Prisoners: Uses of Electronic Monitoring in Brazilian Prisons during the COVID-19 Pandemic","authors":"Maria Rita Pereira Xavier, Ana Paula Ferreira Felizardo, Fábio Wellington Ataíde Alves","doi":"10.24908/ss.v19i2.14303","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14303","url":null,"abstract":"This paper discusses the electronic monitoring (EM) of indicted and convicted citizens in Brazil during the COVID-19 pandemic. We start by discussing how EM was implemented in the country and describing its close link with the technology company Spacecom. We argue that the use of EM to mitigate the impact of COVID-19 in the Brazilian prison system intensifies the continuation of an uninterrupted mechanism of social control that is sustained by systemic racism in Brazil through a growing link between the State and technology companies. Mapping the changes that EM imposes on criminal legal practices, reflecting on data access and management carried out by private companies, and analyzing the acceleration of this process during the COVID-19 pandemic in Brazil are topics addressed herein.","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":" ","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44575128","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Case for a Ban on Facial Recognition Surveillance in Canada","authors":"Tim McSorley","doi":"10.24908/ss.v19i2.14777","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14777","url":null,"abstract":"","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":"1 1","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"69153232","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Review of Mountz’s The Death of Asylum: The Hidden Geographies of the Enforcement Archipelago","authors":"Brandy Cochrane","doi":"10.24908/ss.v19i2.14528","DOIUrl":"https://doi.org/10.24908/ss.v19i2.14528","url":null,"abstract":"chain of state border practices on","PeriodicalId":47078,"journal":{"name":"Surveillance & Society","volume":" ","pages":""},"PeriodicalIF":2.0,"publicationDate":"2021-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47560561","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}