Behavioral and Physiological Responses to Visual Interest and Appraisals: Multimodal Analysis and Automatic Recognition

Frontiers in ICT · Published 2018-07-24 · DOI: 10.3389/fict.2018.00017
M. Soleymani, M. Mortillaro
Citations: 7

Abstract

Interest drives our focus of attention and plays an important role in social communication. Given its relevance for many activities (e.g., learning, entertainment), a system able to automatically detect someone's interest has several potential applications. In this paper, we analyze the physiological and behavioral patterns associated with visual interest and present a method for the automatic recognition of interest, curiosity, and their most relevant appraisals, namely coping potential, novelty, and complexity. We conducted an experiment in which participants watched images and micro-videos while multimodal signals were recorded: facial expressions, galvanic skin response (GSR), and eye gaze. After watching each stimulus, participants self-reported their level of interest, curiosity, coping potential, perceived novelty, and complexity. Results showed that, when dynamics were taken into consideration, interest was associated with facial Action Units other than smiling, especially inner brow raiser and eyelid tightener. Longer saccades were also present when participants watched interesting stimuli. However, correlations of appraisals with specific facial Action Units and eye gaze were in general stronger than those we found for interest. We trained random forest regression models to detect the level of interest, curiosity, and appraisals from multimodal features. The recognition models, both unimodal and multimodal, for appraisals generally outperformed those for interest, in particular for static images. In summary, our study suggests that automatic appraisal detection may be a suitable way to detect subtle emotions like interest, for which prototypical expressions do not exist.
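The pipeline the abstract describes — concatenating features from several modalities and fitting a random forest regressor against self-reported ratings — can be sketched as follows. This is an illustrative sketch only, not the authors' code: the feature blocks (Action Unit intensities, GSR statistics, gaze measures), their dimensions, and the synthetic target are assumptions made for the example.

```python
# Illustrative sketch: random forest regression of a self-reported
# appraisal score (e.g., perceived novelty) from concatenated
# multimodal features, evaluated with cross-validated predictions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_stimuli = 200

# Hypothetical per-stimulus feature blocks for each modality.
au_feats = rng.normal(size=(n_stimuli, 17))   # facial Action Unit intensities
gsr_feats = rng.normal(size=(n_stimuli, 4))   # GSR summary statistics
gaze_feats = rng.normal(size=(n_stimuli, 6))  # gaze measures, e.g., mean saccade length

# Multimodal fusion by feature concatenation.
X = np.hstack([au_feats, gsr_feats, gaze_feats])

# Synthetic self-reported ratings on a 1-9 scale, weakly driven by a
# few features so the model has signal to learn.
y = 5 + 1.5 * X[:, 0] - 1.0 * X[:, 18] + rng.normal(scale=0.5, size=n_stimuli)
y = np.clip(y, 1, 9)

model = RandomForestRegressor(n_estimators=100, random_state=0)
y_pred = cross_val_predict(model, X, y, cv=5)  # out-of-fold predictions

# Correlation between predicted and reported ratings.
r = np.corrcoef(y, y_pred)[0, 1]
print(f"correlation between predicted and reported appraisal: r = {r:.2f}")
```

A correlation between out-of-fold predictions and self-reports is one common way to score such regressors; on this synthetic data the model recovers the planted signal, whereas real behavioral data would yield far noisier estimates.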