The addition of a spatial auditory cue improves spatial updating in a virtual reality navigation task

IF 1.7 · JCR Q3 (PSYCHOLOGY) · Attention, Perception, & Psychophysics · Pub Date: 2024-05-09 · DOI: 10.3758/s13414-024-02890-x
Corey S. Shayman, Mirinda M. Whitaker, Erica Barhorst-Cates, Timothy E. Hullar, Jeanine K. Stefanucci, Sarah H. Creem-Regehr
Citations: 0

Abstract

The addition of a spatial auditory cue improves spatial updating in a virtual reality navigation task

Auditory cues are integrated with vision and body-based self-motion cues for motion perception, balance, and gait, though limited research has evaluated their effectiveness for navigation. Here, we tested whether an auditory cue co-localized with a visual target could improve spatial updating in a virtual reality homing task. Participants navigated a triangular homing task with and without an easily localizable spatial audio signal co-located with the home location. The main outcome was unsigned angular error, defined as the absolute value of the difference between the participant’s turning response and the correct response towards the home location. Angular error was significantly reduced in the presence of spatial sound compared to a head-fixed identical auditory signal. Participants’ angular error was 22.79° in the presence of spatial audio and 30.09° in its absence. Those with the worst performance in the absence of spatial sound demonstrated the greatest improvement with the added sound cue. These results suggest that auditory cues may benefit navigation, particularly for those who demonstrated the highest level of spatial updating error in the absence of spatial sound.
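The study's main outcome measure, unsigned angular error, can be sketched in a few lines. This is an illustrative implementation, not the authors' code; in particular, the wraparound into [0°, 180°] is an assumption (the paper defines the measure simply as the absolute difference between the turning response and the correct response toward the home location).

```python
def unsigned_angular_error(response_deg: float, correct_deg: float) -> float:
    """Absolute angular difference between a participant's turning response
    and the correct heading toward the home location.

    Wrapping the result into [0, 180] degrees is an assumption made here so
    that, e.g., headings of 350 deg and 20 deg differ by 30 deg, not 330 deg.
    """
    diff = abs(response_deg - correct_deg) % 360.0
    return min(diff, 360.0 - diff)

# A response of 350 deg against a correct heading of 20 deg:
print(unsigned_angular_error(350.0, 20.0))  # 30.0
```

For context, the mean errors reported in the study were 22.79° with spatial audio and 30.09° without it.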

Source journal
CiteScore: 3.60
Self-citation rate: 17.60%
Articles published: 197
Review time: 4-8 weeks
Journal description: The journal Attention, Perception, & Psychophysics is an official journal of the Psychonomic Society. It spans all areas of research in sensory processes, perception, attention, and psychophysics. Most articles published are reports of experimental work; the journal also presents theoretical, integrative, and evaluative reviews. Commentary on issues of importance to researchers appears in a special section of the journal. Founded in 1966 as Perception & Psychophysics, the journal assumed its present name in 2009.