Improving Fairness in Criminal Justice Algorithmic Risk Assessments Using Optimal Transport and Conformal Prediction Sets
Richard A. Berk, Arun Kumar Kuchibhotla, Eric Tchetgen Tchetgen
Pub Date: 2023-03-13 | DOI: 10.1177/00491241231155883 | Sociological Methods & Research
In the United States and elsewhere, risk assessment algorithms are being used to help inform criminal justice decision-makers. A common intent is to forecast an offender’s “future dangerousness.” S...
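The abstract above is truncated, but the title names conformal prediction sets. As background for readers unfamiliar with the technique, here is a minimal split-conformal sketch in Python. This is not the authors' fairness method; the risk scores and outcomes are simulated, and the nonconformity function is the generic choice for binary outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy calibration data: predicted risk scores from some hypothetical fitted
# model, with binary outcomes drawn consistently with those scores.
n_cal = 1000
cal_scores = rng.uniform(size=n_cal)
cal_labels = (rng.uniform(size=n_cal) < cal_scores).astype(int)

def conformity(score, label):
    # Nonconformity: how poorly the predicted score fits a candidate label.
    return 1 - score if label == 1 else score

alpha = 0.1  # target miscoverage: sets should contain the true label >= 90% of the time
cal_nonconf = np.array([conformity(s, y) for s, y in zip(cal_scores, cal_labels)])
# Finite-sample-corrected quantile of the calibration nonconformity scores.
q = np.quantile(cal_nonconf, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)

def prediction_set(score):
    # Include every candidate label whose nonconformity is within the threshold.
    return [y for y in (0, 1) if conformity(score, y) <= q]

print(prediction_set(0.95))  # confident case: a single-label set
print(prediction_set(0.5))   # ambiguous case: the set keeps both labels
```

The appeal for risk assessment is that ambiguous cases yield larger sets, making the algorithm's uncertainty visible to the decision-maker rather than hidden behind a point forecast.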
Video Data Analysis and Police Body-Worn Camera Footage
John D. McCluskey, Craig D. Uchida
Pub Date: 2023-02-20 | DOI: 10.1177/00491241231156968 | Sociological Methods & Research 52(1): 1120–1154
Video data analysis (VDA) represents an important methodological framework for contemporary research approaches to the myriad of footage available from cameras, devices, and phones. Footage from police body-worn cameras (BWCs) is anticipated to be a widely available platform for social science researchers to scrutinize the interactions between police and citizens. We examine issues of validity and reliability as related to BWCs in the context of VDA, based on an assessment of the quality of audio and video obtained from that platform. Second, we compare the coding of BWC footage obtained from a sample of police-citizen encounters to coding of the same events by on-scene coders using an instrument adapted from in-person systematic social observations (SSOs). Findings show that there are substantial and systematic audio and video gaps present in BWC footage as a source of data for social science investigation that likely impact the reliability of measures. Despite these problems, BWC data have substantial capacity for judging sequential developments, causal ordering, and the duration of events. Thus, the technology should open theoretical frames that are too cumbersome for in-person observation. Theoretical development with VDA in mind is suggested as an important pathway for future researchers in terms of framing data collection from BWCs and also suggesting areas where triangulation is essential.
3D Social Research: Analysis of Social Interaction Using Computer Vision
Yoav Goldstein, Nicolas M. Legewie, Doron Shiffer-Sebba
Pub Date: 2023-02-14 | DOI: 10.1177/00491241221147495 | Sociological Methods & Research 52(1): 1201–1238
Video data offer important insights into social processes because they enable direct observation of real-life social interaction. Though such data have become abundant and increasingly accessible, they pose challenges to scalability and measurement. Computer vision (CV), i.e., software-based automated analysis of visual material, can help address these challenges, but existing CV tools are not sufficiently tailored to analyze social interactions. We describe our novel approach, “3D social research” (3DSR), which uses CV and 3D camera footage to study kinesics and proxemics, two core elements of social interaction. Using eight videos of a scripted interaction and five real-life street scene videos, we demonstrate how 3DSR expands sociologists’ analytical toolkit by facilitating a range of scalable and precise measurements. We specifically emphasize 3DSR's potential for analyzing physical distance, movement in space, and movement rate – important aspects of kinesics and proxemics in interactions. We also assess data reliability when using 3DSR.
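The measurements the abstract emphasizes (physical distance, movement rate) reduce to simple geometry once 3D positions are extracted. The sketch below assumes the CV step is already done and that we have per-frame 3D coordinates (in meters) for two people; the trajectories and frame rate are fabricated for illustration, not taken from the 3DSR pipeline.

```python
import numpy as np

fps = 30  # hypothetical frame rate of the 3D camera

# Fake trajectories over 3 seconds: person A stands still at (0, 0, 2);
# person B starts 2 m away on the x-axis and walks toward A at 0.02 m/frame.
t = np.arange(0, 90)
person_a = np.tile([0.0, 0.0, 2.0], (len(t), 1))
person_b = np.stack([2.0 - 0.02 * t, np.zeros(len(t)), np.full(len(t), 2.0)], axis=1)

# Proxemics: frame-by-frame Euclidean distance between the two people.
distance = np.linalg.norm(person_a - person_b, axis=1)

# Kinesics: movement rate = per-frame displacement scaled to meters/second.
step = np.linalg.norm(np.diff(person_b, axis=0), axis=1)
speed = step * fps

print(distance[0], distance[-1])  # 2.0 -> ~0.22 m as B approaches
print(speed.mean())               # ~0.6 m/s walking speed
```

Because the coordinates are metric rather than pixel-based, these quantities are directly comparable across camera placements, which is part of what makes 3D footage attractive for scalable measurement.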
Deceptively Approachable: Translating Standards in Qualitative Research (Book Review Symposium: Qualitative Literacy)
Iddo Tavory
Pub Date: 2023-02-03 | DOI: 10.1177/00491241221140431 | Sociological Methods & Research: 1043–1047
Qualitative research is deceptively approachable. With no high-end statistics or computational methods, outsiders and novices alike often feel that they can judge such research “cold,” having neither thought much about it, much less practiced it. After all, they can read the text and understand it, especially when qualitative researchers often take pains to make their prose readable. This has unfortunate results: It creates a lot of random noise in evaluation, but it also means that evaluators will tend to revert to their implicit habits of evaluation—either based on prior theoretical and political commitments, or developed through work with very different methods—when they sit on recruitment, funding, or award committees. At the heart of Small and Calarco’s Qualitative Literacy there is thus a seemingly simple question: How do we know good qualitative research when we see it? How can we tell when it isn’t? When we teach and read quantitative research, we have a more-or-less agreed-upon sense of the way methods should be used and evidence should be supported. While there is never complete agreement, reviews of quantitative work tend to converge around a statistically-defined shared set of standards. Qualitative research is a different beast. While qualitative researchers usually detect good research when they see it, they seem to have a harder time turning this implicit knowledge of craft into a set of guidelines. If the impetus of the book already makes it worthwhile, the key move it makes is as important: rather than gravitating towards quantitative standards and attempting to make qualitative research as close as possible to quantitative reasoning, Small and Calarco (much as Small did in his How many cases do I need?) are adamant that the standards are both rigorous, and quite different.
Theoretical Foundations and Limits of Word Embeddings: What Types of Meaning can They Capture?
Alina Arseniev-Koehler
Pub Date: 2022-12-07 | DOI: 10.1177/00491241221140142 | Sociological Methods & Research
Measuring meaning is a central problem in cultural sociology and word embeddings may offer powerful new tools to do so. But like any tool, they build on and exert theoretical assumptions. In this p...
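For readers new to the method the abstract discusses: word embeddings operationalize "meaning" geometrically, so that semantic relatedness becomes an angle between vectors. The sketch below uses tiny hand-made 3-dimensional vectors purely to illustrate that measurement logic; real embeddings are trained on large corpora and have hundreds of dimensions.

```python
import numpy as np

# Hand-made toy vectors (NOT trained embeddings): chosen so that the two
# occupational terms point in similar directions and the fruit does not.
vectors = {
    "doctor": np.array([0.9, 0.8, 0.1]),
    "nurse":  np.array([0.8, 0.9, 0.2]),
    "banana": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    # Cosine similarity: 1 = same direction, 0 = orthogonal.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vectors["doctor"], vectors["nurse"]))   # high: related meanings
print(cosine(vectors["doctor"], vectors["banana"]))  # low: unrelated meanings
```

The theoretical question the article raises is precisely what kinds of meaning this geometric operationalization can and cannot capture.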
The Augmented Social Scientist: Using Sequential Transfer Learning to Annotate Millions of Texts with Human-Level Accuracy
Salomé Do, Étienne Ollion, Rubing Shen
Pub Date: 2022-12-04 | DOI: 10.1177/00491241221134526 | Sociological Methods & Research
The last decade witnessed a spectacular rise in the volume of available textual data. With this new abundance came the question of how to analyze it. In the social sciences, scholars mostly resorte...
What Good is Qualitative Literacy Without Data Transparency?
Colin Jerolmack
Pub Date: 2022-12-04 | DOI: 10.1177/00491241221140429 | Sociological Methods & Research 52(1): 1059–1072
Ethnographic and interview research have made significant contributions to cumulative social science and influenced the public conversation around important social issues. However, debates rage over whether the standards of positivistic social science can or should be used to judge the rigor of interpretive methods. I begin this essay by briefly delineating the problem of developing evaluative criteria for qualitative research. I then explore the extent to which Small and Calarco's Qualitative Literacy helps advance a set of standards attuned to the distinct epistemology of interview and ethnographic methods. I argue that “qualitative literacy” is necessary but not sufficient to help readers decide whether a particular study is high quality. The reader also needs access to enough information about the researcher's data, field site, or subjects that she can independently reanalyze the researcher's interpretations and consider alternative explanations. I also touch on some important differences between ethnography and interviewing that matter for how we evaluate them.
Sample Selection Matters: Moving Toward Empirically Sound Qualitative Research
Stefanie DeLuca
Pub Date: 2022-12-04 | DOI: 10.1177/00491241221140425 | Sociological Methods & Research 52(1): 1073–1085
Increasingly, the broader public, media and policymakers are looking to qualitative research to provide answers to our most pressing social questions. While an exciting and perhaps overdue moment for qualitative researchers, it is also a time when the method is coming under increasing scrutiny for a lack of reliability and transparency. The question of how to assess the quality of qualitative research is therefore paramount, but the field still lacks clear standards to evaluate qualitative work. In their new book, Qualitative Literacy, Mario Luis Small and Jessica McCrory Calarco aim to fill this gap. I argue that Qualitative Literacy offers a compelling set of standards for consumers to assess whether an in-depth interview or participant observation was of sufficient quality and, to an extent, whether sufficient time was spent in the field. However, by ignoring the vital importance of employing systematic, well-justified, and transparent sampling strategies, the implication is that such essential criteria can be ignored, undermining the potential contribution of qualitative research to a more cumulative creation of scientific knowledge.
The Impact of Survey Mode Design and Questionnaire Length on Measurement Quality
Alexandru Cernat, Joseph Sakshaug, Pablo Christmann, Tobias Gummer
Pub Date: 2022-12-04 | DOI: 10.1177/00491241221140139 | Sociological Methods & Research
Mixed-mode surveys are popular as they can save costs and maintain (or improve) response rates relative to single-mode surveys. Nevertheless, it is not yet clear how design decisions like survey mo...
The Sociological Power of Methodological Rhetoric
J. Katz
Pub Date: 2022-11-29 | DOI: 10.1177/00491241221140427 | Sociological Methods & Research 52(1): 1086–1102
Taking a sociological view, we can investigate the empirical consequences of variations in the rhetoric of sociological methodology. The standards advocated in Qualitative Literacy divide communities of qualitative researchers, as they are not explicitly connected to an understanding of social ontology, unlike previous qualitative methodologies; they continue the long-growing segregation of the rhetorical worlds of qualitative and quantitative research methodology; and they draw attention to the personal competencies of the researcher. I compare a rhetoric of qualitative methodology that: derives evaluation criteria from perspectives on social ontology that have been developing progressively since the early twentieth century; applies the discipline-wide evaluation criteria of reactivity, reliability, representativeness, and replicability; and asks evaluators to focus on the adequacy of the textual depiction of research subjects.