Thick Description for Critical AI: Generating Data Capitalism and Provocations for a Multisensory Approach
Pub Date : 2023-10-01  DOI: 10.1215/2834703x-10734056
Caroline E. Schuster, Kristen M. Schuster
Abstract This article argues that critical AI studies should make a methodological investment in “thick description” to counteract the tendency within both computational design and business settings to presume (or, in the case of start-ups, hope for) a seamless and inevitable journey from data to monetizable domain knowledge and useful services. Perhaps the classic application of that critical data-studies framework is Marion Fourcade and Kieran Healy's influential 2017 essay, “Seeing Like a Market,” which advances a comprehensive account of how value is extracted from data-collection processes. As important as these critiques have been, the apparent inevitability of this assemblage of power, knowledge, and profit arises in part through the metaphor of “sight.” Thick description—especially when combined with a feminist and queer attention to embodiment, materiality, and multisensory experience—can in this respect supplement Fourcade and Healy's critique by revealing unexpected imaginative possibilities built out of social materialities.
{"title":"Thick Description for Critical AI: Generating Data Capitalism and Provocations for a Multisensory Approach","authors":"Caroline E. Schuster, Kristen M. Schuster","doi":"10.1215/2834703x-10734056","DOIUrl":"https://doi.org/10.1215/2834703x-10734056","url":null,"abstract":"Abstract This article argues that critical AI studies should make a methodological investment in “thick description” to counteract the tendency both within computational design and business settings to presume (or, in the case of start-ups, hope for) a seamless and inevitable journey from data to monetizable domain knowledge and useful services. Perhaps the classic application of that critical data-studies framework is Marion Fourcade and Kevin Healy's influential 2017 essay, “Seeing Like a Market,” which advances a comprehensive account of how value is extracted from data-collection processes. As important as these critiques have been, the apparent inevitability of this assemblage of power, knowledge, and profit arises in part through the metaphor of “sight.” Thick description—especially when combined with a feminist and queer attention to embodiment, materiality, and multisensory experience—can in this respect supplement Fourcade and Healey's critique by revealing unexpected imaginative possibilities built out of social materialities.","PeriodicalId":500906,"journal":{"name":"Critical AI","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135457658","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Critical AI and Design Justice: An Interview with Sasha Costanza-Chock
Pub Date : 2023-10-01  DOI: 10.1215/2834703x-10734036
Kristin Rose, Kate Henne, Sabelo Mhlambi, Anand Sarwate, Sasha Costanza-Chock
{"title":"Critical AI and Design Justice: An Interview with Sasha Costanza-Chock","authors":"Kristin Rose, Kate Henne, Sabelo Mhlambi, Anand Sarwate, Sasha Costanza-Chock","doi":"10.1215/2834703x-10734036","DOIUrl":"https://doi.org/10.1215/2834703x-10734036","url":null,"abstract":"","PeriodicalId":500906,"journal":{"name":"Critical AI","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135457820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Editor's Introduction: Humanities in the Loop
Pub Date : 2023-10-01  DOI: 10.1215/2834703x-10734016
Lauren M. E. Goodlad
Abstract This editor's introduction welcomes readers to a new interdisciplinary undertaking. The community of practice that Critical AI addresses hopes to bring critical thinking of the kind that interpretive disciplines foster into dialogue with work by technologists and others who share the understanding of interdisciplinary research as a powerful tool for building accountable technology in the public interest. Critical AI studies aims to shape and activate conversations in academia, industry, policymaking, media, and the public at large. The long and ongoing history of “AI,” including the data-driven technologies that now claim that name, remains riddled with three core dilemmas: (1) reductive and controversial meanings of “intelligence”; (2) problematic benchmarks and tests for supposedly scientific terms such as “AGI”; and (3) bias, errors, stereotypes, and concentration of power. AI hype today is steeped in blends of utopian and dystopian discourse that distract from the real-world harms of existing technologies. In reality, what is hyped and anthropomorphized as “AI” and even “AGI” is the product not only of technology companies and investors but also—and more fundamentally—of the many millions of people and communities subject to copyright infringement, nonconsensual use of data, bias, environmental harms, and the low-wage and high-stress modes of “human in the loop” through which systems for probabilistic mimicry improve their performance in an imitation game.
{"title":"Editor's Introduction: Humanities in the Loop","authors":"Lauren M. E. Goodlad","doi":"10.1215/2834703x-10734016","DOIUrl":"https://doi.org/10.1215/2834703x-10734016","url":null,"abstract":"Abstract This editor's introduction welcomes readers to a new interdisciplinary undertaking. The community of practice Critical AI addresses hopes to bring critical thinking of the kind that interpretive disciplines foster into dialogue with work by technologists and others who share the understanding of interdisciplinary research as a powerful tool for building accountable technology in the public interest. Critical AI studies aims to shape and activate conversations in academia, industry, policymaking, media, and the public at large. The long and ongoing history of “AI,” including the data-driven technologies that now claim that name, remains riddled by three core dilemmas: (1) reductive and controversial meanings of “intelligence”; (2) problematic benchmarks and tests for supposedly scientific terms such as “AGI”; and (3) bias, errors, stereotypes, and concentration of power. AI hype today is steeped in blends of utopian and dystopian discourse that distract from the real-world harms of existing technologies. In reality, what is hyped and anthropomorphized as “AI” and even “AGI” is the product not only of technology companies and investors but also—and more fundamentally—of the many millions of people and communities subject to copyright infringement, nonconsensual use of data, bias, environmental harms, and the low-wage and high-stress modes of “human in the loop” through which systems for probabilistic mimicry improve their performance in an imitation game.","PeriodicalId":500906,"journal":{"name":"Critical AI","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135457826","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}