The Affective Dynamics of Everyday Digital Life: Opening Computational Possibility

Affective Science 4(3): 529–540 · IF 2.1, Q2 (Psychology) · Published 2023-08-07 · DOI: 10.1007/s42761-023-00202-4 · Open-access PDF: https://link.springer.com/content/pdf/10.1007/s42761-023-00202-4.pdf
Maia L. Rocklin, Anna Angelina Garròn Torres, Byron Reeves, Thomas N. Robinson, Nilam Ram
Citations: 1

Abstract


Up to now, there has been no way to observe and track the affective impacts of the massive amount of complex visual stimuli that people encounter “in the wild” during their many hours of digital life. In this paper, we propose and illustrate how recent advances in AI—trained ensembles of deep neural networks—can be deployed on new data streams that are long sequences of screenshots of study participants’ smartphones obtained unobtrusively during everyday life. We obtained affective valence and arousal ratings of hundreds of images drawn from existing picture repositories often used in psychological studies, and a new screenshot repository chronicling individuals’ everyday digital life, from both N = 832 adults and an affect computation model (Parry & Vuong, 2021). Results and analysis suggest that (a) our sample rates images similarly to other samples used in psychological studies, (b) the affect computation model is able to assign valence and arousal ratings similarly to humans, and (c) the resulting computational pipeline can be deployed at scale to obtain detailed maps of the affective space individuals travel through on their smartphones. Leveraging innovative methods for tracking the emotional content individuals encounter on their smartphones, we open the possibility for large-scale studies of how the affective dynamics of everyday digital life shape individuals’ moment-to-moment experiences and well-being.

Supplementary Information: The online version contains supplementary material available at 10.1007/s42761-023-00202-4.
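The two validation-and-mapping steps the abstract describes — (b) checking that a model assigns valence and arousal ratings similarly to humans, and (c) ordering screenshot-level ratings into a per-person trajectory through affective space — can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual pipeline: the function names, toy data, and the use of Pearson correlation as the agreement measure are assumptions for illustration; the paper's affect computation model (Parry & Vuong, 2021) is not reproduced here.

```python
import numpy as np

def rating_agreement(human, model):
    """Pearson correlation between human and model ratings for the
    same set of images (a simple stand-in for step b)."""
    human = np.asarray(human, dtype=float)
    model = np.asarray(model, dtype=float)
    return float(np.corrcoef(human, model)[0, 1])

def affective_trajectory(timestamps, valence, arousal):
    """Order screenshot-level (valence, arousal) ratings by capture
    time to trace the affective space a participant moves through
    on their smartphone (step c)."""
    order = np.argsort(timestamps)
    return [(float(timestamps[i]), float(valence[i]), float(arousal[i]))
            for i in order]

# Toy data: five screenshots with mean human ratings, model ratings,
# and capture times (minutes into a session). Entirely hypothetical.
human_valence = [3.1, 7.2, 5.0, 2.4, 6.8]
model_valence = [3.4, 6.9, 5.2, 2.1, 7.0]
model_arousal = [4.0, 5.5, 3.2, 6.1, 4.8]
times = [12.0, 3.0, 7.5, 1.0, 9.0]

r = rating_agreement(human_valence, model_valence)
traj = affective_trajectory(times, model_valence, model_arousal)
print(f"human-model agreement r = {r:.2f}")
print("trajectory starts at t =", traj[0][0])
```

At scale, the same idea applies per participant: score every screenshot in the stream, then sort by timestamp to obtain the dense valence–arousal time series the paper calls a map of affective space.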
