Self-coding: A method to assess semantic validity and bias when coding open-ended responses

Research and Politics · Impact Factor 2.0 · CAS Tier 3 (Sociology) · JCR Q2 (Political Science) · Publication date: 2021-07-01 · DOI: 10.1177/20531680211031752
Rebecca A. Glazier, Amber E. Boydstun, Jessica T. Feezell
{"title":"Self-coding: A method to assess semantic validity and bias when coding open-ended responses","authors":"Rebecca A. Glazier, Amber E. Boydstun, Jessica T. Feezell","doi":"10.1177/20531680211031752","DOIUrl":null,"url":null,"abstract":"Open-ended survey questions can provide researchers with nuanced and rich data, but content analysis is subject to misinterpretation and can introduce bias into subsequent analysis. We present a simple method to improve the semantic validity of a codebook and test for bias: a “self-coding” method where respondents first provide open-ended responses and then self-code those responses into categories. We demonstrated this method by comparing respondents’ self-coding to researcher-based coding using an established codebook. Our analysis showed significant disagreement between the codebook’s assigned categorizations of responses and respondents’ self-codes. Moreover, this technique uncovered instances where researcher-based coding disproportionately misrepresented the views of certain demographic groups. We propose using the self-coding method to iteratively improve codebooks, identify bad-faith respondents, and, perhaps, to replace researcher-based content analysis.","PeriodicalId":37327,"journal":{"name":"Research and Politics","volume":" ","pages":""},"PeriodicalIF":2.0000,"publicationDate":"2021-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/20531680211031752","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research and Politics","FirstCategoryId":"90","ListUrlMain":"https://doi.org/10.1177/20531680211031752","RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"POLITICAL SCIENCE","Score":null,"Total":0}
Citations: 4

Abstract

Open-ended survey questions can provide researchers with nuanced and rich data, but content analysis is subject to misinterpretation and can introduce bias into subsequent analysis. We present a simple method to improve the semantic validity of a codebook and test for bias: a “self-coding” method where respondents first provide open-ended responses and then self-code those responses into categories. We demonstrate this method by comparing respondents’ self-coding to researcher-based coding using an established codebook. Our analysis shows significant disagreement between the codebook’s assigned categorizations of responses and respondents’ self-codes. Moreover, this technique uncovers instances where researcher-based coding disproportionately misrepresents the views of certain demographic groups. We propose using the self-coding method to iteratively improve codebooks, identify bad-faith respondents, and, perhaps, replace researcher-based content analysis.
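The comparison at the heart of the method, researcher-assigned codes versus respondents' self-codes, overall and by demographic group, can be illustrated with a short sketch. This is not the authors' code: the sample data, the column names (self_code, researcher_code, group), and the choice of Cohen's kappa as the agreement statistic are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' implementation): compare
# respondents' self-codes with researcher-assigned codes from a codebook.
import pandas as pd
from sklearn.metrics import cohen_kappa_score

# Hypothetical data: one row per open-ended response.
df = pd.DataFrame({
    "self_code":       ["economy", "health", "economy", "other", "health", "economy"],
    "researcher_code": ["economy", "other",  "economy", "other", "economy", "economy"],
    "group":           ["A",       "A",      "B",       "B",     "A",      "B"],
})

# Overall agreement: raw agreement rate and chance-corrected Cohen's kappa.
agreement = (df["self_code"] == df["researcher_code"]).mean()
kappa = cohen_kappa_score(df["self_code"], df["researcher_code"])
print(f"Raw agreement: {agreement:.2f}, Cohen's kappa: {kappa:.2f}")

# Group-level agreement: low values flag demographic groups whose self-codes
# diverge disproportionately from the researcher-based coding.
by_group = (
    df.assign(match=df["self_code"] == df["researcher_code"])
      .groupby("group")["match"]
      .mean()
)
print(by_group)
```

Run over a full survey, a breakdown like this would show both the overall codebook-respondent disagreement the abstract reports and any demographic groups that are disproportionately miscoded.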
Source journal
Research and Politics (Social Sciences – Political Science and International Relations)
CiteScore: 2.80
Self-citation rate: 3.70%
Annual article output: 34
Review time: 12 weeks
Journal description: Research & Politics aims to advance systematic peer-reviewed research in political science and related fields through the open access publication of the very best cutting-edge research and policy analysis. The journal provides a venue for scholars to communicate rapidly and succinctly important new insights to the broadest possible audience while maintaining the highest standards of quality control.