Dogs Rely On Visual Cues Rather Than On Effector-Specific Movement Representations to Predict Human Action Targets.

Open Mind (Q1, Social Sciences) · Published: 2023-08-20 · eCollection date: 2023-01-01 · DOI: 10.1162/opmi_a_00096
Lucrezia Lonardo, Christoph J Völter, Claus Lamm, Ludwig Huber
Open Mind, vol. 7, pp. 588-607. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10575556/pdf/
Citations: 0

Abstract

The ability to predict others' actions is one of the main pillars of social cognition. We investigated the processes underlying this ability by pitting motor representations of the observed movements against visual familiarity. In two pre-registered eye-tracking experiments, we measured the gaze arrival times of 16 dogs (Canis familiaris) who observed videos of a human or a conspecific executing the same goal-directed actions. On the first trial, when the human agent performed human-typical movements outside dogs' specific motor repertoire, dogs' gaze arrived at the target object anticipatorily (i.e., before the human touched the target object). When the agent was a conspecific, dogs' gaze arrived at the target object reactively (i.e., upon or after touch). When the human agent performed unusual movements more closely related to the dogs' motor possibilities (e.g., crawling instead of walking), dogs' gaze arrival times were intermediate between the other two conditions. In a replication experiment with slightly different stimuli, dogs' looks to the target object were neither significantly predictive nor reactive, irrespective of the agent. However, when including looks at the target object that were not preceded by looks to the agents, dogs on average looked anticipatorily, and sooner, at the human agent's action target than at the conspecific's. Looking time and pupil size analyses suggest that the dogs' attention was captured more by the dog agent. These results suggest that visual familiarity with the observed action and the saliency of the agent had a stronger influence on the dogs' looking behaviour than effector-specific movement representations in anticipating action targets.


Source journal: Open Mind (Social Sciences – Linguistics and Language)
CiteScore: 3.20 · Self-citation rate: 0.00% · Articles published per year: 15 · Review time: 53 weeks