Motor experience alters action perception through predictive learning of sensorimotor information

Jimmy Baraglia, Jorge Luis Copete, Y. Nagai, M. Asada
DOI: 10.1109/DEVLRN.2015.7346116
Published in: 2015 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), 2015-12-07
Citations: 7

Abstract

Recent studies have revealed that infants' goal-directed action execution strongly alters their perception of similar actions performed by other individuals. Such an ability to recognize correspondences between self-experience and others' actions may be crucial for the development of higher cognitive social skills. However, there is not yet a computational model or constructive explanation accounting for the role of action generation in the perception of others' actions. We hypothesize that sensory and motor information are integrated at a neural level through a predictive learning process. Thus, the experience of motor actions alters the representation of the sensorimotor integration, which causes changes in the perception of others' actions. To test this hypothesis, we built a computational model that integrates visual and motor (hereafter, visuomotor) information using a Recurrent Neural Network (RNN), which is capable of learning temporal sequences of data. We modeled the visual attention of the system based on a prediction error, calculated as the difference between the predicted sensory values and the actual sensory values, which maximizes attention toward sensory information that is neither too predictable nor too unpredictable. We performed a series of experiments with a simulated humanoid robot. The experiments showed that motor activation during self-generated actions biased the robot's perception of others' actions. These results highlight the important role of modality integration in humans, which accounts for a biased perception of our environment based on a restricted repertoire of one's own experienced actions.
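The abstract's attention mechanism — a prediction error driving attention toward signals of intermediate predictability — can be illustrated with a minimal sketch. The paper does not specify the exact weighting function here, so the Gaussian profile, its `target` and `width` parameters, and the per-channel error computation below are assumptions for illustration only, not the authors' implementation:

```python
import math

def attention_weight(prediction_error, target=0.5, width=0.2):
    # Hypothetical Gaussian profile: attention peaks at an intermediate
    # prediction error (target), and falls off for sensory signals that
    # are too predictable (error ~ 0) or too unpredictable (error ~ 1).
    return math.exp(-((prediction_error - target) ** 2) / (2 * width ** 2))

# Per-channel prediction error: |predicted - actual| sensory value,
# standing in for the RNN's one-step prediction error.
predicted = [0.9, 0.5, 0.1]
actual    = [0.9, 0.0, 0.9]
errors = [abs(p - a) for p, a in zip(predicted, actual)]  # [0.0, 0.5, 0.8]

weights = [attention_weight(e) for e in errors]
most_attended = max(range(len(weights)), key=weights.__getitem__)
print(most_attended)  # channel with intermediate error draws most attention
```

Under these assumptions, the fully predicted channel (error 0.0) and the highly surprising one (error 0.8) both receive less attention than the moderately surprising channel (error 0.5), matching the "not too predictable, not too unpredictable" criterion described above.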