Social media emotions annotation guide (SMEmo): Development and initial validity.

Behavior Research Methods · IF 4.6 · JCR Q1 (Psychology, Experimental) · CAS Region 2 (Psychology) · pp. 4435-4485 · Pub Date: 2024-08-01 (Epub: 2023-09-11) · DOI: 10.3758/s13428-023-02195-1
Susannah B F Paletz, Ewa M Golonka, Nick B Pandža, Grace Stanton, David Ryan, Nikki Adams, C Anton Rytting, Egle E Murauskaite, Cody Buntain, Michael A Johns, Petra Bradley
Citations: 0

Abstract

Social media emotions annotation guide (SMEmo): Development and initial validity.

The proper measurement of emotion is vital to understanding the relationship between emotional expression in social media and other factors, such as online information sharing. This work develops a standardized annotation scheme for quantifying emotions in social media using recent emotion theory and research. Human annotators assessed both social media posts and their own reactions to the posts' content on scales of 0 to 100 for each of 20 (Study 1) and 23 (Study 2) emotions. For Study 1, we analyzed English-language posts from Twitter (N = 244) and YouTube (N = 50). Associations between emotion ratings and text-based measures (LIWC, VADER, EmoLex, NRC-EIL, Emotionality) demonstrated convergent and discriminant validity. In Study 2, we tested an expanded version of the scheme in-country, in-language, on Polish (N = 3648) and Lithuanian (N = 1934) multimedia Facebook posts. While the correlations were lower than with English, patterns of convergent and discriminant validity with EmoLex and NRC-EIL still held. Coder reliability was strong across samples, with intraclass correlations of .80 or higher for 10 different emotions in Study 1 and 16 different emotions in Study 2. This research improves the measurement of emotions in social media to include more dimensions, multimedia, and context compared to prior schemes.
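The abstract reports coder reliability as intraclass correlations of .80 or higher. The specific ICC variant is not stated in the abstract, so as a minimal sketch, the following computes the commonly used ICC(2,1) (two-way random effects, absolute agreement, single rater; Shrout & Fleiss, 1979) over a hypothetical posts-by-coders matrix of 0-100 emotion ratings:

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` is an (n_posts, k_coders) array, e.g. 0-100 scores that
    several coders assigned to the same posts for one emotion.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    post_means = ratings.mean(axis=1)    # one mean per post (row)
    coder_means = ratings.mean(axis=0)   # one mean per coder (column)

    # Mean squares from the two-way ANOVA decomposition
    ms_rows = k * ((post_means - grand) ** 2).sum() / (n - 1)   # between posts
    ms_cols = n * ((coder_means - grand) ** 2).sum() / (k - 1)  # between coders
    ss_err = ((ratings - grand) ** 2).sum() \
             - (n - 1) * ms_rows - (k - 1) * ms_cols
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical data: four posts rated by three coders who largely agree.
ratings = np.array([[10., 11.,  9.],
                    [50., 52., 48.],
                    [80., 79., 81.],
                    [30., 31., 29.]])
print(f"ICC(2,1) = {icc2_1(ratings):.3f}")  # high agreement -> close to 1
```

The matrix here is invented for illustration; the paper's actual ratings, coder counts, and ICC variant may differ. When coders agree closely relative to the variation between posts, the statistic approaches 1, which is the pattern behind the reported values of .80 and above.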

Source journal: Behavior Research Methods · CiteScore 10.30 · Self-citation rate 9.30% · Articles published: 266
About the journal: Behavior Research Methods publishes articles concerned with the methods, techniques, and instrumentation of research in experimental psychology. The journal focuses particularly on the use of computer technology in psychological research. An annual special issue is devoted to this field.