Recovering Individual Emotional States from Sparse Ratings Using Collaborative Filtering

Affective Science · Pub Date: 2022-11-19 · DOI: 10.1007/s42761-022-00161-2 · IF 2.1 (Q2, Psychology)
Eshin Jolly, Max Farrens, Nathan Greenstein, Hedwig Eisenbarth, Marianne C. Reddan, Eric Andrews, Tor D. Wager, Luke J. Chang
Journal: Affective Science, vol. 3, no. 4, pp. 799–817
Article: https://link.springer.com/article/10.1007/s42761-022-00161-2
Open-access PDF: https://link.springer.com/content/pdf/10.1007/s42761-022-00161-2.pdf

Abstract

A fundamental challenge in emotion research is measuring feeling states with high granularity and temporal precision without disrupting the emotion-generation process. Here we introduce and validate a new approach in which responses are sparsely sampled and the missing data are recovered using a computational technique known as collaborative filtering (CF). This approach leverages structured covariation across individual experiences and is available in Neighbors, an open-source Python toolbox. We validate our approach across three different experimental contexts by recovering dense individual ratings using only a small subset of the original data. In dataset 1, participants (n=316) separately rated 112 emotional images on 6 different discrete emotions. In dataset 2, participants (n=203) watched 8 short, emotionally engaging autobiographical stories while simultaneously providing moment-by-moment ratings of the intensity of their affective experience. In dataset 3, participants (n=60) with distinct social preferences made 76 decisions about how much money to return in a hidden multiplier trust game. Across all experimental contexts, CF accurately recovered missing data and, importantly, outperformed mean and multivariate imputation, particularly in contexts with greater individual variability. This approach will enable new avenues for affective science research by allowing researchers to acquire high-dimensional ratings of emotional experiences with minimal disruption to the emotion-generation process.
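The core idea, recovering a rater's missing responses from the similarity-weighted responses of other raters, can be sketched with plain user-based collaborative filtering. Note this is a generic illustration, not the Neighbors toolbox API: `cf_impute` and its parameters are hypothetical names, and the similarity metric (mean-centered cosine) and k-nearest-rater weighting are common CF choices, not necessarily the ones used in the paper.

```python
import numpy as np

def cf_impute(ratings, k=5):
    """Fill NaN entries of a subjects-by-items rating matrix using
    user-based collaborative filtering: each missing rating is the
    similarity-weighted average of the k most similar raters'
    mean-centered ratings for that item."""
    R = np.asarray(ratings, dtype=float)
    mask = ~np.isnan(R)
    # Each rater's mean over their observed items (0 if nothing observed).
    means = np.array([row[m].mean() if m.any() else 0.0
                      for row, m in zip(R, mask)])
    # Mean-centered ratings, with missing entries set to 0 so they
    # contribute nothing to the similarity computation.
    C = np.where(mask, R - means[:, None], 0.0)

    # Cosine similarity between raters; zero the diagonal so a rater
    # is never their own neighbor.
    norms = np.linalg.norm(C, axis=1)
    norms[norms == 0] = 1e-12
    S = (C @ C.T) / np.outer(norms, norms)
    np.fill_diagonal(S, 0.0)

    out = R.copy()
    n_users, n_items = R.shape
    for u in range(n_users):
        nbrs = np.argsort(-S[u])[:k]  # k most similar raters
        for i in range(n_items):
            if mask[u, i]:
                continue
            valid = [v for v in nbrs if mask[v, i]]
            w = S[u, valid] if valid else np.array([])
            if w.size == 0 or np.abs(w).sum() < 1e-12:
                out[u, i] = means[u]  # fall back to the rater's own mean
                continue
            out[u, i] = means[u] + w @ C[valid, i] / np.abs(w).sum()
    return out
```

Because predictions borrow only from similar raters, structured covariation across individuals (e.g., subgroups who respond alike) is exactly what makes recovery work, and is why a pooled mean would fare worse when individual variability is high.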
